Wednesday, January 27, 2010

EMA’s Mann: enable the ‘responsible cloud’ while allowing cowboys enough rope to experiment

I remember back in the '90s when the World Wide Web was being described as the Wild Wild Web. For me, that phrase always conjured up jarring visions of gunslingers waiting patiently for Netscape Navigator to download page after page of underlined, blue hyperlinks.

Fast forward to 2010 and the next big wave of IT change -- cloud computing -- is being described in much the same way. Andi Mann, vice president at industry analyst group Enterprise Management Associates (EMA), recently authored a report (“The Responsible Cloud”) that used the same sort of frontier metaphor to describe the situation enterprises find themselves in with cloud computing today.

Andi's report not only presented data describing the state of cloud computing in larger enterprises, but also emphasized that many key management disciplines, while critical for cloud, are still not in place. Both Carl Brooks of SearchCloudComputing and Denise Dubie of Network World wrote good summaries of some of the EMA report’s findings (CA, my employer, by the way, was one of the report’s sponsors).

To dig into what some of the specific survey findings might mean, Andi took a few moments to do this Data Center Dialog interview. Read on for Andi’s thoughts on the speed of cloud adoption, why he was so surprised by some of EMA’s data about virtualization, respondents’ strong bias against public clouds, and just how you can make the most of those IT cowboys.
Jay Fry, Data Center Dialog: Andi, the EMA Responsible Cloud study you just published has some great data on customers’ current thinking about the most important issues in cloud computing right now (hint to readers: it’s not the definition). First, a couple of specific questions about your report. You reported that 11% of your surveyed companies intend to implement cloud computing in the next 12 months. Does that feel like a lot, a meager number, or just about right given where you feel things are at the moment?

Andi Mann, Enterprise Management Associates: Given that we are really still in an early-adopter period with cloud computing, I think 11% seems just right, considering we were looking only at mid- to very large-sized enterprises. Of course, there is a lot of hype around, with predictions of 20, 30, 40% or more of enterprises switching to cloud, but these seem over-optimistic to me. Cloud is a fundamental shift in how we build and deliver IT services, and especially with these larger organizations, it would be surprising to see overwhelming adoption just yet. With economic growth still slow, adoption at that scale would mean enterprises dumping the sunk cost in their existing IT, which realistically is simply not going to happen overnight.

DCD: One of the big ongoing debates (still, unfortunately) is about the reality of private clouds. Your survey numbers say that they are not only real, but are favored by a significant number of customers (75%). Some industry watchers have called private clouds a stepping stone to the public cloud; others see them as an end in and of themselves. Still others call them purely self-serving vendor hype. What is your take on your results – and the debate?
Andi Mann: The results did not surprise me. In fact, it seems to me that a lot of predictions about public cloud are based on impossibly idealistic notions of IT, not on the realities of a day-to-day, in-the-trenches view. Enterprises like the idea of cloud, but are looking for a balance between the ideal and the achievable. So they will look to get the benefits of cloud computing – the economies, the self-service, the agility and flexibility – from their own existing investments.
Over time they will likely branch out into public clouds in some way, but IT never gets rid of anything – mainframes, COBOL, client-server. Heck, I know some very large organizations still running Windows NT and even DOS on mission-critical systems today. I have no doubt it will be the same with cloud – private and public cloud will both simply extend the IT we already have, not replace it.

DCD: I like your comment that cloud computing is “like any ‘new frontier,’” with “too many cowboys, and not enough sheriffs.” The whole focus of your report is the “responsible” cloud and your prescriptive maturity model for how to get there. What is so “irresponsible” about how many organizations are approaching the cloud at the moment, and how do you suggest that be fixed? What’s going to lead to more sheriffs in IT? Will we get there quickly enough?
Andi Mann: The issue I see with many “irresponsible” cloud deployments is that they are lacking important management discipline – monitoring, audit, change detection, data management, and more. As a direct result, they are open to some pretty crippling problems, like data loss, downtime, poor performance, audit failure, and more – which might be fine for a Web 2.0 startup, but is unacceptable for a global financial enterprise.
As cloud use becomes increasingly mission-critical in major organizations, I do believe we will see repercussions for this lack of management, including loss of business, damaged credibility, prosecution, fines, and more. When that starts to happen, we see in our research that performance monitoring, event management, reporting, security, automation, problem diagnosis, alerting, and more become not just nice-to-have, but mission-critical in their own right.
DCD: Coming from CA, I’m obviously a big proponent of the management disciplines you talk about at length in your report and of bringing those to the cloud. However, I’ll play devil’s advocate for a moment: isn’t there something good about breaking the rules sometimes to get huge changes accomplished? I’ve heard lots of anecdotal evidence suggesting that cloud computing is following the same adoption pattern as many disruptive innovations before it: adopters often do their adopting by ignoring the rules (developers paying for EC2 instances on their credit cards and the like). If IT always stuck to the tried-and-true, innovation would never have a chance. How should companies look at those trade-offs?
Andi Mann: That is actually a really good point. It is important to see that “irresponsible cloud” is not always bad per se, or at least not forever. My colleague Julie Craig actually pointed this out in the report, saying that the “cowboys” of IT, just like in the Wild West, have an important role in opening up new frontiers like cloud. Embodying many positive aspects of the pioneering spirit of the Wild West cowboy – adventurous, innovative, self-sufficient – they take risks so that they and others can eventually reap the rewards.
But there is a time and a place for them, and once they start to threaten the integrity of mission-critical systems and data, they must be reined in. Otherwise organizations will start to face data breaches, compliance and audit failures, service outages, staff costs, and more that will damage their business, rather than enable it.
DCD: You talk a bit in your survey about “rogue deployments” of cloud computing. What should even the most process-driven IT department learn from some of these rogue cloud efforts? Have you seen a good way to pull some of the rogue efforts back into the mainstream?
Andi Mann: This comes down to gaining visibility into cloud deployments – such as through application mapping, network monitoring, change detection, or simply asking – and then allowing the cowboys enough rope to experiment, to learn what works and what doesn’t, to build skills and knowledge, to fix broken processes and streamline failing ones, and even to make some mistakes, as long as they are not "playing" with mission-critical systems.
Over time, these cowboys can potentially become important contributors in a new cloud group within IT, with a mandate to experiment and go outside the box – starting with non-disruptive systems and experimental applications, but over time taking on mission-critical systems too. Certainly stomping them out is not the answer – instead, by finding out what they are doing and why, you will learn what is broken and how to fix it.
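
[A quick aside to make that “visibility” point concrete: one of the simplest starting points is scanning an outbound traffic log for connections to well-known public cloud endpoints. Here’s a minimal, hypothetical Python sketch of the idea – the log format and the provider list are my own illustrative assumptions, not anything from the EMA report. -ed.]

```python
# Hypothetical sketch: flag outbound connections to public cloud providers
# in a simple comma-separated egress log (timestamp,source_host,destination).
# Both the log format and the domain list are illustrative assumptions.

CLOUD_ENDPOINTS = ("amazonaws.com", "force.com", "gogrid.com")

def find_possible_rogue_use(log_lines):
    """Return (source, destination) pairs that touch known cloud endpoints."""
    hits = []
    for line in log_lines:
        try:
            _timestamp, source, destination = line.strip().split(",")
        except ValueError:
            continue  # skip malformed lines rather than fail the whole scan
        if destination.endswith(CLOUD_ENDPOINTS):
            hits.append((source, destination))
    return hits

sample_log = [
    "2010-01-27T09:15:02,dev-laptop-42,ec2-174-129-1-1.amazonaws.com",
    "2010-01-27T09:16:09,db-server-07,backup.internal.example.com",
]
for source, destination in find_possible_rogue_use(sample_log):
    print(f"possible rogue cloud use: {source} -> {destination}")
```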

DCD: Your data about virtualization adoption shows VMware at the top, as expected. However, both Microsoft Hyper-V and Oracle VM (with the Sun xVM numbers added in) are not so very far behind. That seems really surprising to me. What’s your take?
Andi Mann: Yes, that was surprising to me too. VMware leading the pack is not a shock, of course. I have been seeing Hyper-V gaining strength in virtualization too, so I am not too surprised Microsoft is a strong second choice, especially given the massive deployment of Windows and the strong leaning among respondents toward private, on-premise cloud.
However, I am very surprised that Citrix XenServer in particular is behind Oracle VM/Sun xVM (although the gap between them is within the survey’s margin of error). I think Oracle VM or Sun xVM are well behind even other Xen variants as general virtualization platforms. They especially need to grow their limited ecosystem of management support, particularly in essential areas of cloud management like automation and service management, before they are even close to parity with ESX, Hyper-V, or XenServer. Yet with Sun’s stellar pedigree throughout the Internet boom, Oracle’s market dominance in applications, and a formidable stack that runs from hardware all the way up, perhaps it stands to reason that Oracle has a bigger opportunity in cloud than I and many others (including Larry Ellison, at least in public [prior to their recent Sun briefing, anyway -ed.]) previously thought.
DCD: What were the other big shockers for you as you put together the report?

Andi Mann: For me, a few things really stand out. The first is the importance that these respondents put on management. Common wisdom (which frequently is neither) says that cloud is flexible, open, unrestricted, and unencumbered by processes or disciplines like ITIL or ISO or ITSM. However, the reality is the exact opposite: a majority of organizations actually believe that most management disciplines are very important, with many cited as being even more important to cloud than to traditional IT.

The second big shocker was the impact cloud can have on maturity, with the data showing not just a correlation between maturity and cloud deployment, but even suggesting that cloud deployment actually leads to increased IT operational maturity.
Third is the high number of respondents that cited difficulty or cost of implementation and lack of flexibility and agility as their most critical problems in cloud computing. Again, this runs contrary to the common wisdom that cloud is easy to deploy and flexible to run.
Finally, perhaps the biggest shock is the total beatdown that real enterprises give to the idea of public cloud. With 69% of respondents completely rejecting public cloud options, and only 9% choosing a purely public cloud approach (compared to 46% adopting a purely private one), it really stands out as a pariah among the deployment models.
DCD: When customers are looking to select cloud computing technologies, your numbers say that an existing relationship with a vendor is the least important of all selection criteria. So, all bets seem to be off with the cloud, eh? Any insights on why?
Andi Mann: This is not entirely unusual. In fact, we have seen the same result in desktop virtualization. There are a lot of things more important than a single supplier or a relationship with an account manager. Ultimately, these are nice to have, but if you cannot ensure uptime, security, performance, or other absolutely critical values, a supplier relationship will not stop you going out of business. So enterprises are rightly focusing on what is really important. Vendor relationships will come into play, but only after all other requirements are satisfied.
So virtualization and other vendors that have aspirations to be cloud technology providers had best make sure they bring their ‘A’ game – they will not get on the team by reputation alone.
DCD: Many surveys about cloud computing have pointed to security, compliance, technology maturity, and even potential vendor lock-in as barriers to success. Interestingly, yours is one of the few (only?) I’ve seen where “human/political issues” tops the list. From my experience, organizations don’t always realize that is going to be a problem, but it ends up being a huge one, so I think you’re spot on. Any color on this? And did you hear anything about how organizations are currently looking to address this hurdle?
Andi Mann: I think a lot of people, including analysts, can forget that IT is a combination of people, process, and technology. We saw in our research into virtualization on multiple occasions that the human issues were actually the biggest hurdles to success – especially skills, resourcing, and time, but also departmental politics – so we included this in our cloud survey as well.
Especially with new technologies, the technology and process can be difficult, but ultimately it seems the people issues are the hardest to overcome. Again, you can start to address these issues with controlled ‘skunkworks’ – using deployments in non-critical areas not just to build skills, but also to establish repeatable procedures and to show how to be successful – so you at least get a head start on overcoming skills, resourcing, and political issues when the time comes to deploy cloud for critical systems and applications.

Thanks for the additional insights and in-depth commentary on the EMA report and survey results, Andi.
If you would like the report but are not an EMA customer, you can purchase it here.

Monday, January 11, 2010

CA buys Oblicore: contracts & SLAs are pivotal for cloud computing

One of the most interesting aspects of cloud computing is that the conversation shifts from what IT has to deliver to what’s really important: the service you’re trying to provide.

With the cloud, no longer is it sufficient to simply talk about the ins and outs of your technology stack; in fact, the idea is for many of those underlying details to drop down into the mists of irrelevance.

So far, cloud providers have been pretty careful about sticking their necks out, barely offering any sort of SLA that you can take to the bank. The result? "Best efforts" by the cloud vendors to keep things running, but downtime situations similar to what salesforce.com users faced last week.

However, as cloud computing matures, this has to change. And as it does, being able to track and manage the service level obligations you have with providers – internal and external – will be crucial.
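
To make that concrete, an SLA is ultimately just arithmetic: a “number of nines” translates directly into minutes of allowable downtime. Here’s a quick back-of-the-envelope sketch (the targets are illustrative, not any particular provider’s contractual terms):

```python
# Back-of-the-envelope SLA math: how much downtime does a given uptime
# target actually allow in a 30-day month? These targets are illustrative,
# not any particular cloud provider's contractual terms.

MINUTES_PER_MONTH = 30 * 24 * 60  # 43,200 minutes in a 30-day month

for target_pct in (99.0, 99.9, 99.99):
    allowed_minutes = MINUTES_PER_MONTH * (1 - target_pct / 100)
    print(f"{target_pct}% uptime allows {allowed_minutes:.1f} minutes of downtime/month")

# Output:
# 99.0% uptime allows 432.0 minutes of downtime/month
# 99.9% uptime allows 43.2 minutes of downtime/month
# 99.99% uptime allows 4.3 minutes of downtime/month
```

The difference between “best efforts” and even two nines, measured against your actual contract, is exactly where service level tracking earns its keep.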

What CA’s acquisition of Oblicore could mean for cloud computing

This is one of the reasons that CA’s just-announced acquisition of service level management innovator Oblicore could be so intriguing: the company has set the bar for managing business-relevant SLAs.

The Oblicore technology is best known for its ability to take technical information about what’s going on in your IT environment and correlate that with the business-level information held in your service level contracts. The company’s name is an appropriate summary: service level “obligations” are at the “core” of your business.

The business-level information is a way for Oblicore’s customers – from CIOs to managers of the IT services being bought from external vendors – to get a real handle on what they or their service providers are truly delivering. Oblicore has a bunch of integrations with technical third-party tools (including CA products, thanks to our partnership with them in customer engagements, but also many others from the likes of HP, IBM, BMC, Oracle, SAP, and Microsoft) that can give customers the detailed input needed for comparison. And Oblicore should be a nice complement to CA’s existing service management solutions.
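
Conceptually, this kind of correlation is a join between two worlds: monitored technical metrics on one side, contractual obligations on the other. Here’s a toy sketch of the idea – the data structures are my own illustration, not Oblicore’s actual model:

```python
# Toy illustration of contract-vs-measurement correlation: checking
# collected technical metrics against business-level SLA obligations.
# These data structures are illustrative, not Oblicore's actual model.

contract_obligations = {
    "availability_pct": {"target": 99.9, "higher_is_better": True},
    "avg_response_ms":  {"target": 500,  "higher_is_better": False},
}

measured_metrics = {"availability_pct": 99.95, "avg_response_ms": 620}

def evaluate(obligations, metrics):
    """Report each contractual obligation as met, breached, or unmeasured."""
    for name, terms in obligations.items():
        value = metrics.get(name)
        if value is None:
            print(f"{name}: no measurement collected")
            continue
        if terms["higher_is_better"]:
            met = value >= terms["target"]
        else:
            met = value <= terms["target"]
        status = "met" if met else "BREACHED"
        print(f"{name}: measured {value}, target {terms['target']} -> {status}")

evaluate(contract_obligations, measured_metrics)
# availability_pct: measured 99.95, target 99.9 -> met
# avg_response_ms: measured 620, target 500 -> BREACHED
```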

More details and roadmaps will become clear as we work with the Oblicore team (all of whom are joining CA, by the way) on integration plans. But having their contract management capabilities strengthens CA’s service management offerings today – and, as we broaden our cloud computing portfolio, makes a lot of sense going forward.

EMA and Gartner on the importance of service management in the cloud

Lisa Erickson-Harris, research director at analyst firm Enterprise Management Associates (EMA), fleshed this out a bit in the press release:

“EMA believes cloud computing trends will further increase the demand for service-based management including service level and service value management. Customers are demanding solutions like Oblicore that can offer business-oriented service management including details related to the service contract, collaboration during the negotiation process, and analytics to represent service delivery results in a dashboard format. Oblicore's strengths are well-suited to bridge the IT/business gap by capitalizing on existing management data, business-driven service definition and analytics to demonstrate service delivery results at each service lifecycle phase.”

This IT/business gap that Lisa mentions was one of the key points Donna Scott and Ronni Colville made in their Gartner Data Center Conference keynote last month in Las Vegas. Their advice for IT? “Until you become a driver in the alignment with the business, you are not going to be critical to the business – nor to the CEO."

“You’ve got to know what you have,” said Colville. “You have to have visibility to your services” in business terms. Scott reiterated the same point: “You have to know what you have and what’s effective” so you can “figure out whether they are the appropriate investments.”

Oblicore’s top-down, business-level approach (as opposed to the bottom-up technical details, which are useful when translated and given context, but can be useless by themselves) can be a big help here. If you want to manage business value, a contract-based view of service levels is an innovative and effective way to start.

For more on these topics, there are some interesting blog entries on service management and cloud computing at the ca.com community site.

Broader implications for CA’s recent cloud-related acquisitions?

I have always thought of CA as a bit of an acquisition machine. That’s certainly been the company’s reputation (and certainly not always to its benefit). However, this is the first CA acquisition I’ve been part of from the inside, and I’m hoping those on the outside will start to see a strategic thoughtfulness in CA’s recent moves, especially in the area of cloud computing.

To recap some of the interesting ones of late, CA has picked up Cassatt, NetQoS, and now Oblicore. Add to this the organic development underway and there’s certainly some stuff to watch as we move through 2010. (What? There’s organic development at CA? To which I answer: surprise! Yes, indeed, there is.) More comments to come as things happen.

The Oblicore acquisition press release from CA can be found here.

Wednesday, January 6, 2010

Watching cloud computing trends for 2010: the vision, customer reality, & downstream impact

We’ve crossed into a new decade (or not, if you’re a numbers purist), and it seems an appropriate time for a little reflection, and maybe a chance to get some feel for where things are headed in 2010 for IT operations, especially as they look at what cloud computing is going to mean for them.

Last year, I rattled off a Top 10 list of Top 10 lists. This year, I, for one, am suffering from a bit of Year-End Top 10 Prediction Fatigue, so I’ll hold off on that for the moment. Instead, I thought I’d check the stats from this Data Center Dialog blog as a way to get a bead on things that people have been interested in here. That way I’m not just pulling commentary out of thin air. Plus, it’s more scientific that way, right?

So, to mark Data Center Dialog’s a-bit-more-than-one-year anniversary, here's a look back at the most popular posts over the past 6 months. My guess is it gives some indication of what people will be considering for the initial months of 2010 as well.


Beyond definitions: looking for vision…and then practical cloud considerations

It’s probably no surprise that the most popular new post here was also the one that explained a bit about the biggest news story to involve us (now former) Cassatt folks: the acquisition of the Cassatt assets and expertise by CA in June. I provided a bit of commentary on the acquisition just before taking an extended few weeks off in Berlin prior to starting my current gig at CA. It’s not too far-fetched to predict that in the months ahead there will be lots more details to talk about regarding what we’ve all been doing at CA since then. (That’s an easy prediction, for sure.)
Fumble! What not to do at a cloud computing conference – The endlessly repeated ploy of starting panel sessions at cloud computing events with the question, “So what is cloud computing?” finally took its toll on me in November. The result was a bit of a (popular) rant about why the people working on cloud computing need to move on to much more useful questions. At least that’s the only way I’m going to be able to sit through another cloud computing conference.
Judging by the numbers, I think you’re with me.

Not that talking about definitions was bad. 2009 was a year in which the definitions of cloud computing (public/private, internal/external, hybrid, and the like) came into focus as the discussion evolved throughout the year. To prove the point, the most popular entry of the last 6 months was the same entry that was the most popular of the first 6 months of the year: Are internal clouds bogus? That post was followed closely by one that described the shifts in the discussion toward hybrid clouds – and the speed with which the combination of public and private cloud computing was likely to become a reality (answer: it’ll take a bit; there are some missing pieces still). In fact, my highlight blog entry that tracked the evolution of the private cloud from the front row seat I’d had was also a favorite.

So, yes, there was a place for the definitional conversation. But real-world information about what customers are doing now was in great demand (and still is, say the stats). This pragmatism is heartening and it propelled reader interest in the entry I did on the 451 Group’s cloud computing customer panel at their ICE conference, alongside a post from earlier in the year listing actual customer questions that our field team had been getting about private clouds. There’s nothing quite like getting things from the horse’s mouth.

Here’s something that was perhaps part of that same trend toward getting in better sync with reality: measurement of what is actually going on in data centers (even when it shows a trend toward upholding long-established patterns of inefficiency) was also of interest. I saw that as good news, especially since we also had lots of interest in our post from earlier in 2009 discussing the fact that many data center managers don’t actually know what their servers are doing. The first step to a solution is understanding what the problem is, right?

Notable Data Center Dialog interviews: Steve Hamm of BusinessWeek, Bill Coleman, and Mark Bramfitt

Some of the Data Center Dialog interviews (a feature I started at the beginning of 2009 with Al Gillen of IDC) were a few of the most popular posts in the second half of the year. The most read interview? It was one in which I turned the tables on a member of the so-called mainstream media and interviewed him: BusinessWeek’s Steve Hamm had some interesting insights on Silicon Valley in general. It didn’t hurt that he linked to the interview from his blog, too, of course. Interestingly, he has now done what many journalists are doing out of necessity -- changed careers. Steve noted via Twitter a few weeks back that he’s now at IBM.

Also interesting to our readers were the conversations I published about two folks well-known in the world of IT management talking about their Next Big Things. Bill Coleman, my former CEO, gave his take on where cloud computing is now (just Version 1.0, he said) and what he’s working on after Cassatt. Mark Bramfitt talked about his move from a leading role in PG&E’s data center energy efficiency programs to private industry in a two-part interview just published at the end of December. Both Bill and Mark included some candid thoughts on what’s gone well and not so well in their previous roles.

The longer-term implications of cloud computing
We also saw interest in some of the posts pondering what cloud computing might mean to the industry as a whole. Will it mean less will be spent on IT, or, in fact, help accelerate growth? And what about the oft-noted bogeyman of automation? Will the cloud finally mean that automation takes center stage without being cast as the human-hating Skynet from the Terminator flicks? That topic generated some interest for sure.

And, of course, Twitter…
And, as you might expect, our readers were aligned with the rest of the industry (world?) in their interest in Twitter over the past few months. I used VMworld as a case study of 7 ways that IT conferences can be improved by using Twitter – and 2 ways it makes them worse. That one seemed especially popular with folks who found us via – you guessed it – Twitter.

So what does this all mean for 2010? I have no idea. But I’d bet a couple of these trends will continue to be important. The discussion around how private, public, and hybrid cloud computing will work together will certainly continue. I’m expecting, however, that it will become more focused on the day-to-day practicalities that end-user IT departments need to know.

I’ll do my best to make sure I continue to interview folks of interest in the industry with useful perspectives that will benefit IT operations and those doing big thinking about the many ins and outs of cloud computing.

And, Data Center Dialog will continue to be a place to get a pulse on topics at the forefront of the way data centers – and IT in general -- are being run and managed. As customers continue their focus on cloud computing, this blog will too. Thanks for being part of the dialog.