Monday, January 19, 2009

Are internal clouds bogus?

Two articles in the past week have questioned the legitimacy of one of the industry's favorite new terms: internal cloud computing. I thought it was a good time to weigh in on whether internal clouds are legit or just a bunch of vendor marketing hype that data center managers should ignore.

Andrew Conry-Murray at InformationWeek wrote that "there's no such thing as a private cloud." Specifically, Drew said: "I don't think the notion of a private cloud makes any sense. In my mind, a key component of the definition of a cloud is that you, the enterprise, don’t have to host and run the infrastructure. All that stuff is supposed to be out of your hands."

Eric Knorr of InfoWorld had a similar comment, but different reasoning. He heard pitches about internal clouds from vendors and interpreted what he heard as a hypefest from Dell, IBM, and others (I bet he'd throw Cassatt in that category, too, though we haven’t talked with him). Their pitches, he said, reeked of the unachievable, "automagic" IT nirvana he'd heard about for years, but never seen delivered.

In general, I say "amen." Picking up on buzzword-compliant terminology without a way to deliver on it is a waste of everyone's time. But does that mean the "internal cloud" term is a bust? Nope. Here's how we look at the internal cloud debate here at Cassatt:

A cloud by any other name

Drew's issue is not whether or not the idea of more dynamic capabilities is a good one. In fact, he finds some of the technology pitched as "private clouds" to be pretty compelling (enough, he says, to continue following some of the vendors pitching it to him).

Instead, he's drawn a line in the...er, sky and said that if it ain't a service coming from outside your data center, it ain't cloud computing. That's fine. Having clear definitions is important, especially while a given technology or idea is careening up the hype curve.

However, here's what we're seeing: since its founding in 2003, Cassatt has been talking (sometimes more eloquently than others, admittedly) about how our software delivers the benefits of cloud-style architectures using your existing computing infrastructure. But as Craig Vosburgh noted in his posting here a few weeks back, that capability has gone by other names along the way. The problem we've seen is that none of those terms had a Trout & Ries-style "position" in people's heads. IT ops folks had no category to put us into. In fact, most of the terms ("utility computing," "on-demand computing," "autonomic computing") have been greeted with a profound glazing of eyes at best, and extreme skepticism at worst, often accompanied by "I'm hoping virtualization will do that for me."

The understanding of cloud computing (and the market) is evolving

Within the past few months, however, we have noticed a pretty interesting shift as the cloud hype has accelerated. When we use cloud computing as a reference point in trying to explain the kind of thing our software is capable of, suddenly people have a mental model to discuss the dynamic, on-demand nature of it. They've heard the cloud computing idea. And they like it. And that gives us a starting point for a real discussion.

Of course, they don't necessarily like everything about cloud computing today. Not everything can be migrated to run in Google's or Amazon's cloud immediately. In fact, we're asked how we can help folks get the benefits of what Drew and others call "real" cloud computing, but for their existing apps and infrastructure.

Another example: we noticed the same change in talking with people in the exhibit halls at the recent VMworld and Gartner Data Center conferences. In previous years, we've struggled a bit to make clear to someone walking up to the Cassatt booth what it is we do and how that might benefit their day-to-day job, data center operations, and bottom line. Often we'd have to hand them some product lit for deeper study.

However, starting at VMworld, we've talked to booth visitors about how we can help them create a cloud-style architecture from their existing IT systems. Click. Not that suddenly everyone can rattle off our product data sheets, but there's a logical starting point. And I don’t think it was just the glowing green swizzle sticks we were giving away.

Even more interesting: customers are coming to the realization that virtualization is not going to solve everything for them. It's just a starting point of the change in store for their approach to IT operations.

So maybe it's a matter of semantics and nomenclature. But it certainly feels to me like a step in the evolution of this market. And these things always do tend to evolve. Example: the same publication, InformationWeek, that ran Drew's article that was skeptical about internal clouds also ran an article by John Foley a few months back, saying that private clouds were, in fact, "taking shape." Last week, John also posted a blog entry about his meeting with our CEO, Bill Coleman, and our discussion about – you guessed it – private clouds. So the evolution of the conversation continues.

Skepticism about what automation can do today

Now, Eric's comments in InfoWorld seem to be less about the term "private" or "internal" cloud and more about the tendency for IT vendors to pitch capabilities and results from their fabulous products that are beyond belief.

He's usually right. Shame on us vendor types. Customers and journalists should be especially careful when promises are coming from big, established vendors who actually have a lot to lose economically from dramatic innovation and optimization in how your data center capital is being used. You do the math: on-demand data center infrastructures mean LESS capital spending, at least in the short term. Don’t tell me the hardware guys at Dell or IBM think that's in their best interest. Appirio made similar observations in their blog predicting the fall of private clouds. (By the way, I disagree with Appirio's premise that organizations that will get benefit from an internal cloud are "few and far between." I'm betting that if you have several hundred servers, you'll benefit from breaking down your application silos and pooling your resources.)
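To make that math a bit more concrete, here's a rough, back-of-the-envelope sketch of the silo-versus-pool comparison. Every number in it (per-app peaks, utilization, headroom) is purely an assumption for illustration, not real customer data; plug in your own figures.

    # Back-of-the-envelope comparison of siloed vs. pooled capacity.
    # Every number here is an illustrative assumption, not a real customer's data.

    app_peaks = [12, 8, 20, 6, 15, 10, 9, 20]  # servers each app needs at its own peak (assumed)
    avg_utilization = 0.25                     # average load as a fraction of peak in a silo (assumed)
    headroom = 1.2                             # 20% safety buffer on the shared pool (assumed)

    # Siloed: every app keeps enough servers for its own worst case, all the time.
    siloed_servers = sum(app_peaks)

    # Pooled: assume the apps don't all peak at once, so size the pool for the
    # busiest single app plus the average load of everything else, with headroom.
    busiest = max(app_peaks)
    pooled_servers = round((busiest + avg_utilization * (siloed_servers - busiest)) * headroom)

    print(f"Siloed capacity: {siloed_servers} servers")
    print(f"Pooled capacity: {pooled_servers} servers")
    saved = siloed_servers - pooled_servers
    print(f"Capital avoided: {saved} servers ({100 * saved / siloed_servers:.0f}%)")

With those made-up inputs, a hundred siloed servers collapses to roughly fifty in a shared pool. Your mileage will vary, but that's the kind of arithmetic the hardware vendors aren't eager to do for you.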

I'll tell Eric, however, that despite how "left at the altar" he might feel about utility-style computing, he shouldn't give up hope. There's real, honest-to-goodness innovation showing up in this space. And, we see actual end users, not just cloud providers, investigating it. For what it's worth, since we started using the "internal cloud" term, we've seen a strong uptick in the number and quality of inquiries about our software. Customers are looking to solve a real problem.

Try a self-funding, incremental test of internal clouds

And, knowing that Eric's reaction is not atypical, we (for one) suggest that customers take a very incremental approach: pick an app or two (only) for initial evaluation, probably in your dev/test environment, and use that to see what's really doable today.

Then, use the savings from your first, small implementation to fund any further roll-out. That way, if it doesn't save you anything or improve things in IT ops, you don't proceed any further.

But, if (somehow) your test of internal clouds actually delivers, you'll have both the proof and a plan to move ahead by leaps and bounds.
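To picture that self-funding gate in the simplest possible terms, here's a sketch of the go/no-go check. The savings and cost figures below are placeholders (pure assumptions), meant to be replaced with the numbers your own pilot produces.

    # Self-funding go/no-go check for each rollout phase.
    # The pilot numbers below are placeholders -- plug in your own results.

    pilot_monthly_savings = 4000   # measured monthly savings from the one- or two-app pilot (assumed)
    months_observed = 3            # how long the pilot has run (assumed)
    next_phase_cost = 10000        # estimated cost to extend to the next group of apps (assumed)

    banked = pilot_monthly_savings * months_observed

    if banked >= next_phase_cost:
        print(f"Proceed: ${banked:,} in banked savings covers the ${next_phase_cost:,} next phase.")
    else:
        print(f"Hold: ${next_phase_cost - banked:,} short -- keep measuring, or stop here.")

The point isn't the arithmetic; it's that each phase only happens if the previous one paid for it.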

No matter what you call it.

(You can register for Cassatt's white paper on how to get started with internal cloud computing here.)

2 comments:

Ryan Nichols said...

Hi Jay-- great post, and thanks for the commentary.

Absolutely agree that companies with hundreds of servers need help managing them.

But the idea that somehow companies can use “private cloud” technology to offer web services similar to Google, Amazon, or salesforce.com will lead to massive disappointment.

Your readers may be interested in our reasoning here....

Jay Fry said...

Hey Ryan--
Yep, good discussion. Companies are certainly not going to be able to just add some magical software and suddenly be the new Amazon. However, the approach we're working with customers on is to apply the concepts of cloud computing (responding to demand, using only what you need, etc.) to your existing apps and infrastructure inside your data center.

The plan we work on with companies is to start small and move a few apps at a time to a dynamic, shared infrastructure of hardware, software, and networks. The thing that makes this appealing (at least, so far!) is that unlike with the (current) external cloud idea, you don't have to re-build your apps to run on a new environment of some sort. Maybe the trick is to think big, but start small.

We've seen the most interest from big companies with complex data centers and serious security/compliance requirements. For them, an internal cloud is much more feasible than something outside their four walls.