Monday, December 28, 2009

PG&E’s Bramfitt: data centers can be efficient and sustainable, but we must think beyond our current horizons

Before heading off for a little vacation, I posted the first part of a two-part interview with Pacific Gas & Electric’s Mark Bramfitt. Mark is best known in data center circles (you know, the kind of people that hang around this site and others like it) as the face of the Bay Area utility’s efforts to drive data center energy efficiency through innovative incentive programs. Now that I’m back, I’m posting part 2 of that discussion.

In the first part of the interview, Mark talked about what’s worked well and what hasn’t in PG&E's data center energy efficiency efforts, the impact of the recession on green IT projects, and how cloud computing is impacting the story.

After the interview appeared, Forrester analyst Doug Washburn wondered on Twitter if PG&E might kill the incentive programs Mark had previously been guiding. The thought is a natural one given how integral Mark has been to the program. From my interview, it didn’t sound so far-fetched: Mark himself thought cost pressures might cause the utility to “de-emphasize…some of the industry leadership activities undertaken by PG&E…we may not be able to afford to develop new programs and services if they won’t deliver savings.”

In the final part of the interview, posted below, Mark gets a little philosophical about energy efficiency metrics, highlights what’s really holding back data center efficiency work, and doles out a final bit of advice about creating "inherently efficient" IT infrastructure as he departs PG&E.

Jay Fry, Data Center Dialog: Mark, we’ve talked previously about the fact that a lot of the data center efficiency problem comes down to split incentives – often IT doesn’t pay or even see the power bill and the facilities guy doesn’t have control over the servers that are creating the problem. In addition, facilities is usually disconnected from the business reason for running those servers. How serious is this problem right now? What effective or creative ways have you seen for dealing with it?

Mark Bramfitt, PG&E: While I think we can all cite the split incentive issue as a problem, it’s not the only obstacle to success and it gets perhaps a bit too much air time. As much as I’d like to say that everyone’s focus on energy efficiency is making the split incentive obstacle moot, our experience is that there are only two ways that we get past that.

The first is when the IT management side simply has to get more done with less spend, leading them to think about virtualizing workloads. The second is when a facility runs out of capacity, throwing both IT and facility managers into the same boat, with energy efficiency measures often saving the day.

DCD: Many of the industry analyst groups (the 451 Group, Forrester, Gartner, etc.) have eco-efficient IT practices or focus areas now, a definite change from a few years ago. Is this a sign of progress and how can industry analyst groups be helpful in this process?


Mark Bramfitt: I can’t argue that the focus on energy-efficient IT is anything but a good thing, but I’ve identified a big resource gap that hampers the ability of utilities to drive efficiency programs. We rely on engineering consultants who can accurately calculate the energy savings from implementing facility and IT improvements, and even here in the Bay Area, we have a hard time finding firms that have this competency – especially on the IT equipment side.

In my discussions with utilities across the U.S., this is probably the single biggest barrier to program adoption – they can’t find firms who can do the calculations, or resources to appropriately evaluate and review them.

So, I think there’s a real opportunity for IT service providers – companies that both sell solutions and services – to step into this market. I don’t know that analyst firms can do much about that.

DCD: I think it’s fair to say that measurement of the power and ecological impact of data centers has started and really does matter. In the (pre-CA) survey Cassatt did at the beginning of this year on the topic, we found that there were definitely areas of progress, but that old data center habits die hard. Plus, arguments among the various industry and governmental groups like the Green Grid and Energy Star over how and what to measure (PUE, DCiE, EUE) probably don’t help. How do you think people should approach measurement?


Mark Bramfitt: We’ve delivered a consistent message to both customers and to vendors of metering and monitoring systems that we think quantifying energy use can only have a positive impact on driving people to manage their data centers better. Metering and monitoring systems lead people to make simple changes, and can directly measure energy savings in support of utility incentive programs.

We also like that some systems are moving beyond just measurement into control of facility and IT equipment, and to the extent that they can do so, we can provide incentive funding to support implementation.

The philosophical distinctions being made around what metrics are best are understandable – the ones we have now place an emphasis on driving down the “overhead” energy use of cooling and power conditioning equipment, and have nothing to say about IT power load. I believe that the industry should focus on an IT equipment “utilization index” rather than holding out for the ideal efficiency metric, which is probably not conceivable given all of the different IT workloads extant in the marketplace.
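
(A quick illustration of Mark’s point: PUE is just total facility power divided by IT equipment power, and DCiE is its inverse, so neither says anything about whether the IT watts are doing useful work. The short sketch below is mine, not Mark’s or the Green Grid’s — Python, with invented sample numbers, and a “utilization index” formula that is purely my own illustration rather than any industry standard.)

```python
# Rough illustration of the facility-level metrics discussed above,
# plus a hypothetical IT utilization index. All sample numbers are invented.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

def dcie(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Data Center infrastructure Efficiency: the inverse of PUE, as a percentage."""
    return 100.0 * it_equipment_kw / total_facility_kw

def utilization_index(useful_work_kw: float, it_equipment_kw: float) -> float:
    """Hypothetical 'utilization index': share of IT power doing useful work.
    Illustration only -- there is no agreed industry definition."""
    return 100.0 * useful_work_kw / it_equipment_kw

if __name__ == "__main__":
    total_kw, it_kw, useful_kw = 1500.0, 1000.0, 350.0   # invented sample loads
    print(f"PUE:  {pue(total_kw, it_kw):.2f}")            # 1.50
    print(f"DCiE: {dcie(total_kw, it_kw):.0f}%")          # 67%
    print(f"Utilization index: {utilization_index(useful_kw, it_kw):.0f}%")  # 35%
```

The point of the toy numbers: you can drive PUE down by trimming cooling and power-conditioning overhead and still be wasting most of the IT load itself, which is exactly why Mark argues for a utilization-style metric.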

DCD: One of the major initiatives you had was working with other utilities to create a coalition pushing for incentives similar to those PG&E has been offering. I’ve heard Data Center Pulse is helping work on that, too. How would you characterize where we are with this process currently? What’s next?

Mark Bramfitt: The PG&E-sponsored and -led Utility IT Energy Efficiency Coalition [also mentioned in part 1 of Mark’s interview] now has almost 50 members, and I would say it has been a success in its core mission – to provide a forum for utilities to share program models and to discuss opportunities in this market space.

I think it’s time for the Coalition, in whatever form it takes in the future, to expand into offering the IT industry a view of which utilities are offering programs and how to engage with them, as well as a place for IT and other companies to list their competencies.

I’ll be frank, though, in saying that I don’t know whether PG&E will continue to lead this effort, or if we need to think about another way to accomplish this work. I’ve had conversations with a lot of players to see how to maintain the existing effort as well as to extend it into new areas.

DCD: Watching this all from PG&E is certainly different from the perspective of vendors, IT departments, or facilities folks. Any comments on seeing this process from the vantage point you’ve had?

Mark Bramfitt, PG&E: Utilities primarily look at the data center market as a load growth challenge: how can we provide new energy delivery capacity to a segment that is projected to double every 5 years? There’s already immense national and global competition for locations where 20 or even 100 MW loads can be accommodated, and where customers want to build out facilities in months, not years.

PG&E’s answer to that is to actively work with customers to improve energy efficiency, extending our ability to accommodate new load growth without resorting to energy supply and delivery capacity that is expensive to build and not our best choice from an environmental sustainability perspective.

My larger view is that IT can deliver tremendous environmental benefits as it affects broad swaths of our activities – improving delivery systems, for example, and in my line of work enabling “smart grid” technologies that can improve utility operation and efficiency. But to get there, we need to have IT infrastructure that is inherently efficient and thereby sustainable, and we have a huge opportunity space to make that happen.

DCD: Do you have any advice you’d care to leave everyone with?

Mark Bramfitt: Advice? Every major achievement I’ve seen in this space has been due to people expanding their vision and horizons. It’s IT managers taking responsibility for energy costs even if they don’t roll up in their budget. It’s IT companies supporting efficiency measures that might in some ways be at cross-purposes with their primary business objectives. And it’s utilities that know that their mission can’t just be about delivering energy; they need to support customers and communities in new ways.



Thanks, Mark, for taking the time for the interview, and best of luck with the new venture.

Wednesday, December 16, 2009

As Bramfitt departs PG&E, where will the new focus for data center energy efficiency efforts be?

If you were in the audience at the Silicon Valley Leadership Group’s Data Center Energy Efficiency Summit earlier this year, you were probably there (among other things) to hear Mark Bramfitt from Pacific Gas & Electric (PG&E). Mark has been the key figure in the Bay Area utility’s efforts to drive improvements in how data centers are run to cut energy costs for the data center owners and to reduce their burgeoning demand for power.

But, Mark had a surprise for his audience. During his presentation, he announced he was leaving PG&E.

“The reaction from the crowd was impressive…and for good reason,” said John Sheputis, CEO of Fortune Data Centers, in a story by Matt Stansberry at SearchDataCenter. “Mark is an excellent speaker, a very well-known entity in the Valley, and among the most outspoken people I know of regarding the broader engagement opportunities between data centers and electricity providers,” Sheputis said. “No one has done more to fund efficiency programs and award high tech consumers for efficient behavior.”

Mark has stayed on with PG&E for the months since then to help with the transition. Before he moves on to his Next Big Thing at the end of 2009, I thought I’d ask him for his thoughts on a few relevant topics. In the first part of the interview that I’m posting today, Mark talks about what’s worked well and what hasn’t in PG&E's data center energy efficiency efforts, the impact of the recession on green IT projects, and how cloud computing is impacting the story.

Jay Fry, Data Center Dialog: Mark, you’ve become synonymous with PG&E’s data center efficiency work, and if the reaction at the SVLG event to the announcement that you’ll be leaving that role is any indication, you’ll be missed. Can you give some perspective on how things have changed in this area during your time on the project at PG&E?

Mark Bramfitt, PG&E: First, I truly appreciate the opportunity to offer some clarity around PG&E’s continued focus on this market, as well as my own plans, as I feel I’ve done something of a disservice to both the IT and utility industry nexus by not being more forthright regarding our plans.

My team and I have treated the data center and IT energy efficiency market as a start-up within PG&E’s much larger program portfolio, and we’ve seen a great growth curve over the past four years – doubling our accomplishments in 2008 compared to 2007, for example. We’ve built an industry-leading portfolio of programs and services, and I expect PG&E will continue to see great engagement from our customers in this space.

That being said, utilities in California are under tremendous pressure to deliver energy efficiency as cost effectively as possible, so some of the industry leadership activities undertaken by PG&E may have to be de-emphasized, and we may not be able to afford to develop new programs and services if they won’t deliver savings.

My personal goal is to see 20 or more utilities follow PG&E’s lead by offering comprehensive efficiency programs for data centers and IT, and I think I can best achieve that through targeted consulting support. I’ve been supporting utilities essentially in my “spare” time, in part through the Utility IT Energy Efficiency Coalition, but there are significant challenges to address in the industry, and I think my full-time focus as a consultant will lead to broader success.

DCD: Why are you leaving PG&E, why now, and what will you be working on?

Mark Bramfitt: It may sound trite, or worse, arrogant, but I want to amplify the accomplishments we’ve made at PG&E over the past few years, using my knowledge and skills to drive better engagement between the utility and IT industries in the coming years. PG&E now has a mature program model that can be executed well in Northern California, so I’d like to spend my time on the bigger challenge of driving nationwide activities that will hopefully yield big results.

DCD: You had some big wins at some big Silicon Valley data centers: NetApp being one of those. Can you talk about what came together to make some of those possible? What should other organizations focus on to get them closer to being able to improve their data center efficiency as well?

Mark Bramfitt: Our “big hit” projects have all been new construction engagements where PG&E provides financial incentives to help pay for the incremental costs of energy efficiency improvements – for measures like air- or water-side economizers, premium efficiency power conditioning and delivery equipment, and air flow isolation measures.

We certainly think our financial support is a big factor in making these projects work, but my project managers will tell you that the commitment of the project developer/owner is key. The design team has to want to work with the technical resource team PG&E brings to the table, and be open to spending more capital to realize expense savings down the road.

DCD: You made some comments onstage at the Gartner Data Center Conference last year, saying that “It’s been slow going.” Why do you think that’s been, and what was most disappointing to you about this effort?

Mark Bramfitt: I don’t have any real disappointments with how things have gone – we’re just very focused on being as successful as we can possibly be, and we are introspective in thinking about what we could do better.

I’d characterize it this way: we’ve designed ways to support on the order of 25 energy efficiency technologies and measures, absolutely leading the utility industry. We’ve reached out to dozens of VARs and system integrators, all of the major IT firms, every industry group and customer association, made hundreds of presentations, delivered free training and education programs, the list goes on.

What has slowed us down, I think, is that the IT industry and IT managers had essentially no experience with utility efficiency programs three years ago. It simply has taken us far longer than we anticipated to get the utility partnership message out there to the IT community.

DCD: The green IT hype was pretty impressive in late 2007 and early 2008. Then the economic crisis really hit. How has the economic downturn affected interest in energy efficiency projects? Did it get lost in the crisis? My personal take is that it certainly didn’t get as much attention as it would have otherwise. Maybe the recession caused companies to be more practical about energy efficiency topics, but I’m not sure about that. What are your thoughts?

Mark Bramfitt: I don’t see that the message has been lost, but certainly the economy has affected market activity.

PG&E is not seeing the level of new data center construction that we had in ’07 and ’08, but the colocation community tells me demand is exceeding supply by 3-to-1. They just can’t get financing to build new facilities.

On the retrofit side, we’re seeing interest in air flow management measures as the hot spot, perhaps because customers are getting the message that the returns are great, and it is an easy way to extend the life and capacity of existing facilities.

DCD: The other topic that’s taken a big share of the IT and facilities spotlight in the last year has obviously been cloud computing. How do you see the efficiency and cloud computing conversations playing together? Is the cloud discussion helping or hindering the efficiency discussion inside organizations in your opinion?

Mark Bramfitt: I’ve talked to some thought leaders on cloud computing and many seem to think that highlighting the potential energy efficiency advantages of shared services has merit. But with regard to our program delivery, the intersection has really been about how to serve colocation owners and tenants, rather than on the broader topic of migration to cloud services.



Be sure to come back for Part 2 of the interview. We'll cover a few other topics of note, including Mark’s thoughts on the philosophical differences over measurement approaches, the single biggest barrier to data center efficiency program adoption, and even a little bit of parting advice from Mark as he gets ready to leave PG&E.

Monday, December 14, 2009

IT metrics: a snore? Not when cloud computing is involved

Amazon’s recent announcement about the availability of spot pricing for EC2 instances got me thinking a bit about the role measurement is going to play in cloud computing. In fact, I sat in on a few sessions at the Gartner Data Center Conference earlier this month focused exclusively on metrics.

A snore? Actually not.

The increasingly dynamic world of cloud computing is going to require (in fact, completely rely upon) a robust and trustworthy set of metrics for how things are going. So much so, in fact, that I’d expect a bit of a renaissance in this space.


And while Amazon’s market-leading moves point to all sorts of requirements to measure costs, availability, and other requirements relating to their EC2 offering, many IT shops are thinking much more locally: how to get going on their own private compute clouds. IT metrics & measurement are going to get more and more important for those IT shops, too. Getting the right metrics working in a way that helps you make IT decisions is a really important step. Sometimes you’ll be impressed with all sorts of useful data about your infrastructure. Just as often you aren’t going to like what those metrics report back.

Here are a few additional thoughts about using metrics to help manage IT, using the Gartner metrics sessions from Vegas as a jumping off point:

An interesting conundrum for metrics: will they help align IT & business – or make things worse?

Gartner analyst Milind Govekar spent an entire session at Gartner’s conference (“Improving Your IT Business through Metrics”) talking about ways to manage IT, and ways to use metrics to that end. Not that this is new. In fact, one of the sessions that Gartner seems to run every year is something about how misaligned IT and the business are…or how best to get them aligned…or how they are moving closer. But I heard a few interesting wrinkles this time around on how measurement should/could be used:

· Without a certain level of IT operational maturity, investment in infrastructure – and measurement of that infrastructure -- is not going to yield great results. Govekar pointed out that the majority of the customers they talk to are about halfway through a journey toward more dynamic, adaptable IT. And that’s about the point where organizations can see a measurable pay-off from the investments. “That’s where you [the IT department] start to make a difference as a trusted service provider,” Govekar said. So, that also means you should expect the journey up to that point to have some bumps. Several analysts noted a need to improve process maturity to make true IT operations progress. And, of course, Gartner conveniently gives you a way to measure where your particular shop is on that journey.

· Metrics can and should be important, but be careful. Choose wisely. Govekar put it very succinctly in his session when he said, “Be careful with metrics: they drive people to do good things; they drive people to do bad things.” As we (hopefully) learned from Wall Street last year, people optimize around what they are measured and compensated for. Make sure it’s the right thing.

· Organizational patience is key. Govekar listed senior management commitment, persistence, and patience as very important to efforts to use metrics to drive IT change. An organization without an interest in continual improvement is not likely to be able to grapple with what the metrics will reveal – and the cultural changes that are necessary to get to that more dynamic IT they think they want.

· You have to be working at a level that business people will care about in order to have any chance of getting the ear of the CEO. The Gartner conference keynote from Donna Scott and Ronni Colville focused on this practical truth: if you can’t talk to your business folks about ways that IT can help them improve what they’re trying to do, you’re going to have a difficult path to getting your job done. And, your CIO is not likely to be seen as a full partner in the business. While “technology may be used to help innovation,” said Scott, “people and processes are the backbone of innovation, not technology.” Good point. All those virtual machines may sound cool, but not necessarily to everyone. And whatever stats, facts, and figures you do end up with, don’t expect them to be as meaningful to the business set as they might be to the IT folks.

In one of the more interesting sessions about metrics, however, Gartner began to paint a picture of some of the innovative things you can do with them.

Gartner analyst Joseph Baylock had some intriguing thoughts in his keynote about ways to use patterns from your infrastructure & operations data to improve IT’s sensitivity to the business. And, if the fancy brochures I found in the lobby are any indication, it sounds like Gartner is going to be spending much more time and effort thinking this through.

Baylock talked about creating an IT operations model that seeks out patterns, applies them to models, and then adapts accordingly. What could this mean for IT? Probably nothing quite yet. However, for companies with performance-driven cultures, Gartner saw this as a way to move from using lagging indicators for managing IT (and, hence, your business) to using leading indicators.

So, not only will you want to be able to measure what you're using at any given time (from Amazon or whomever else externally – and from your own internal infrastructure), but perhaps you’ll want to start to get smart about sifting through the mounds of data to make good choices.
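
To make that concrete, here’s a toy example of the kind of arithmetic I mean -- all of the rates and hours below are invented for illustration, not Amazon’s actual pricing -- comparing what a bursty workload would cost at a flat hourly rate versus a fluctuating spot-style rate:

```python
# Toy cost comparison for a bursty workload: flat hourly pricing vs. a
# fluctuating spot-style price. All numbers are invented for illustration.

hours_used = [3, 8, 2, 10, 6, 0, 1]             # instance-hours consumed each day
flat_rate = 0.10                                 # hypothetical flat $/instance-hour
spot_rates = [0.04, 0.09, 0.03, 0.12, 0.05, 0.06, 0.02]  # hypothetical spot $/hour per day

flat_cost = sum(hours_used) * flat_rate
spot_cost = sum(h * r for h, r in zip(hours_used, spot_rates))

print(f"Flat-rate cost:  ${flat_cost:.2f}")      # $3.00 with these sample numbers
print(f"Spot-style cost: ${spot_cost:.2f}")      # $2.42 with these sample numbers
print(f"Savings: {100 * (flat_cost - spot_cost) / flat_cost:.0f}%")  # 19%
```

Trivial, yes, but multiply that kind of calculation across hundreds of workloads, internal and external, and the mounds of data start earning their keep.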

Sounds to me like it’s worth keeping an eye (or two) open -- and on this topic -- after all.

Tuesday, December 8, 2009

Cloud, the economy, and short-term thinking: highlights from Gartner's Data Center Conference

There was a lot to take in at the Gartner Data Center Conference this year, and unfortunately I had a very abbreviated amount of time actually on the ground in Vegas to do it. What was the most notable thing? In my opinion, the headline was how much Gartner has strengthened its feelings about cloud computing in 12 months.

And not any 12 months, mind you, but 12 months darkened by the worst economic downturn of our lives. Of course, they did talk plenty about the impact the economy has had, especially on the IT operations psyche and approach, but cloud computing won out, hands down.

From what I gathered, Gartner now sees cloud computing (especially private clouds) and the real-time infrastructures that support them as something big customers should take a look at.

This certainly isn’t an earth-shattering change. Last year at this time, Tom Bittman’s keynote was about how the future of IT looked a lot like a private cloud. This year, the keynotes and sessions throughout the week definitely gave the impression that cloud computing has evolved and improved this past year, and Gartner has moved its thinking as well. The cloud computing discussion felt more mainstream, more central to many of the presentations. It certainly played a starring role in more of them.

Private clouds received quite a bit of positive airtime, and a near-unanimous vote of confidence from Gartner analysts – something that I didn’t feel was true at last year’s event. This year, Bittman was one of a larger chorus, rather than one of the few vocal Gartner private cloud advocates.

Dave Cappuccio’s keynote on “Infrastructure & Operations: Charting a Course for the Coming Decade” emphasized Gartner’s enthusiasm around private clouds with this prediction: of the money being spent on cloud computing over the next 3-5 years, “we think private cloud services will be 70-80% of the investment.” That sounds pretty serious to me.

Cloud computing focus soars, even though budgets don’t

The audience in the room for the opening keynotes (polled as usual with handheld devices scattered throughout the room) considered cloud computing to be the 3rd biggest data center challenge for 2010 – in a world in which they also expect flat IT operations budgets. The No. 1 challenge was “space, power, & cooling” and No. 2 was that perennial issue, “linking IT & business.” In fact, 6% of people in the room put cloud computing as their top funded item for next year (compared with 9% for virtualization). That’s impressive.

Gartner believes the bad economy is forcing short-term thinking on infrastructure & operations projects

But don’t think that just because cloud computing progressed, there’s nothing dramatically different from last year. There is: the impact of the economy on the behavior of IT operations.

One of the most depressing quotes of the conference was not from Jeffrey Garten of the Yale School of Management (though I thought it might be). He actually had a pretty interesting talk -- in plain English -- about what has gone well and what has gone badly in the attempts to stave off a full global financial meltdown that went into high gear at the end of 2008. Instead, it was how Donna Scott and Ronni Colville characterized IT’s response.

Is the best offense a good defense? Really?

Because of the bad economy, according to Scott and Colville, any talk about data center or infrastructure “transformation” is right out the window. Instead, they said, the kinder, gentler term “restructuring” is in vogue. I guess that descriptor implies short-term ROI, which is at least pragmatic, since projects with long-term pay-outs are off the table completely. Organizations, said Scott, are instead switching to fast (OK, rushed) IT restructuring moves now.

So, hooray for the focus on ROI, but the shorter time horizon isn’t necessarily good news. It means less thoughtful analysis and planning, and a focus only on what can be done immediately. The unfortunate result? “A lot of optimization decisions are made in silos,” Scott said, when in fact sharing those efforts (things like moving to shared IT services), could have a much broader impact. Gee, thanks, bad economy.

In fact, many of the comments I heard from analysts gave the distinct impression that IT is (and should be) on the defensive. “Promises of IT transformation,” said Milind Govekar during his talk on IT metrics, “are taking a back seat to immediate goals of cost optimization, increasing quality or reducing risk.” Govekar noted that, in fact, “cloud hype” is a part of the problem; it “threatens IT budgets and control.”

Comments like these from Govekar and others felt a bit like the analysts were hearing (and in some cases recommending) that IT should circle the wagons, implying that cloud computing was threatening the status quo in IT. Well, if all of its promises come true, it certainly could.

But not if IT operations makes the first move. How? I liked the bit of advice I saw come from Carl Claunch’s keynote: pilot everything. Private clouds, public clouds, hybrid clouds. If you want to know how to do it, the only way to learn is to get started.

For the record, I seem to recall recounting some very similar advice from last year’s conference. Nevertheless, from what I saw and heard this year, Gartner’s research and the evidence from attendees show increased interest in and adoption of cloud computing and real-time infrastructures, despite hiccups, bumps, and bruises from the Great Recession.

That in and of itself is impressive.

Tuesday, November 24, 2009

Is cloud a turkey for IT vendors? Thinking through its impact on end user IT spending

Amid preparations for Thanksgiving feasts here in the U.S., a couple recent predictions from IT industry experts could leave vendors with much less to be thankful for this year. I’ve seen more than one pundit say that even though cloud computing is the Next Big Thing for IT, it will, in the end, mean that less money is being spent on IT.

Is cloud computing really going to mean doing more with less?

Matt Asay, for example, recently noted that a new Goldman Sachs report called “A Paradigm Shift for IT: The Cloud” found that even though CIOs may be “loosening the purse strings on IT spending, IT vendors may want to hold off their celebrations.” Why? “Much of this spending appears to be headed for deflationary forces like cloud computing [and] virtualization.”

At SYS-CON’s Cloud Computing Expo earlier this month, I heard GoGrid CEO John Keagy announce an IT vendor’s moment of triumph and death knell in one breath. “The dream of utility computing has arrived,” he said, followed quickly with: “I believe we’re at the height of IT spending right now. You’ll see the IT economy shrink to half its current size.”

And, finally, during a panel at Interop last week reported by Joe McKendrick, AT&T’s Joe Weinman raised doubts about the sustainability of cloud computing economics, describing a scenario in which they break down as enterprise management requirements come into play. “I’m not sure there are any unit-cost advantages that are sustainable among large enterprises,” he said. He expects adoption of external cloud computing in some areas, and private capabilities for others.

Now that’s not the kind of market that the purveyors of cloud computing hype were likely hoping for. In fact, it looks to be a bit of a, um, turkey. “An economic rebound never looked so dire,” noted Asay. “That’s unless you’re an IT buyer, of course.”

More on Matt’s last comment in a moment. First, where are these comments coming from?

Goldman Sachs expects a big bump from cloud computing…

The dire remarks Asay picked up are interspersed in that pretty straightforward Goldman Sachs report he mentioned that covers the basics of cloud computing. (In fact, a big section of the report is itself entitled “Cloud Computing 101: definition, drivers and challenges,” so you shouldn’t expect big leaps forward in analysis.)

However, there are a couple of interesting observations from the report about the importance of cloud computing and its relation to the IT spending environment:

· Cloud computing is “similar in importance to the transition from mainframes to client-server computing in the 1980s.”
· The new data center approaches and architectures required for cloud computing “are coming at an auspicious time: they are likely to coincide with a meaningful cyclical upturn in IT spending over the next couple of years.” One underlying reason for the uptick: the underinvestment that the bad economy has caused in the past 18 months or so.
· “Our checks suggest that enterprise IT demand has hit an inflection point and is now beginning to improve.”

Hang on, that all sounded a bit positive, at least to my ears. They expect growth from re-architecting systems “upon a foundation of virtualization and increased automation.” Goldman (and Asay) note the typical (read: big) players that will be taking advantage of that.

So this sounds rosy for 2010 and the years just beyond. Goldman says “we expect a meaningful increase in demand in the technologies that enable virtualization and increase the availability and security of external Clouds.”

…and now the “but”:

· But, longer term, cloud computing could actually “drive some headwinds for the IT industry” because they see virtualization (one of the interesting components that can be involved in cloud computing) as a “deflationary technology.” Goldman sees IT spending concentrating into the hands of fewer buyers: mainly cloud providers, hosting vendors, and large enterprises. They think better utilization will mean less pressure to spend, and when they do, it’ll be a buyer’s market.

For SaaS, however, there was less uncertainty. SaaS offerings were seen to be expanding rapidly and leading other “external” cloud services. In fact, it is now an “unstoppable shift” from on-premise to on-demand delivery. Evangelos Simoudis, managing director of Trident Capital, posted his thoughts from the Goldman event that linked with the report, which underscored their SaaS discussion: “Two years ago,” wrote Simoudis, “the discussion was whether the enterprise will adopt any SaaS applications. Today the conversation is about which on-premise applications will be replaced by their SaaS equivalent.”

As always, baby steps to start

For areas of cloud computing other than SaaS, however, two themes emerged from the Goldman report that matched what other reputable reports have been saying. First, “for large enterprises, Cloud Computing is still very early.” And, second, “most industry participants we spoke with expect large enterprises to take baby steps in their adoption of Cloud Computing resources, especially for business-critical activities.”

Goldman highlights cloud management

Additional comments from Simoudis are relevant here in helping analyze what Goldman’s thoughts might mean. “Infrastructure-as-a-Service (IaaS) requires management skills that most IT shops today don’t have,” he said. As a result, it’s easy to see why Goldman also noted that “systems management [is] likely to become a key component of the platform over time.” Allan Leinwand of Panorama Capital, also speaking during the Interop panel I mentioned previously, underscored this management gap. “There’s a gap between what the enterprise is used to doing behind the firewall and what the cloud providers can do.”

Goldman sees traditional systems management players (BMC and CA being two they call out) going to market with more IT automation capabilities for provisioning workloads in both physical and virtual environments, as well as extending capabilities to the public cloud. But hang on to your hats: they also predict an eventual battle between systems management vendors and platform vendors such as VMware and Microsoft over where these capabilities end up. Something to watch, for sure.

“CIOs don’t want a single, ubiquitous stack for cloud computing,” predicts Simoudis, “but the ability to work simultaneously with multiple stacks coming from different vendors.” CA’s certainly betting on that approach. Goldman’s published take on CA is that it (along with others) has cloud computing as a “potential new growth vector.” That’s a comment I agree with, or (obviously) I wouldn’t still be here.

The opposite of deflationary: Cloud as another channel for growth?

So back to the question. Is IT spending headed for a permanent downshift? As enterprises find their way to the cloud, using baby steps or otherwise, we’ll certainly see if this deflationary impact is for real.

Now, I know that this whole cloud computing revolution is supposed to first and foremost be an economic one. As in, good for the economics of running IT. No one said anything about it being good for the IT vendors. Unless, of course, they figure out how to get on the side of the customers (always a good idea, if you ask me).

And if they do figure out how to side with the customers, maybe the cloud is in fact another channel for growth. Someone recently reminded me how early e-commerce over the Internet looked back in 1999 (Black Friday was much different back then – and there wasn’t even a Cyber Monday). At that time, many analysts predicted the demise of the brick-and-mortar retail store. Instead, the Web has become another channel to sell goods; it has given people another channel to sell and buy.

Is that a good way to look at what will happen with cloud computing as well? Or maybe that’s the wrong metaphor.

Well, it’s at least worth a thought between slices of pumpkin pie. Gobble, gobble.

Thursday, November 19, 2009

Salesforce.com and CA: using Force.com for agile and SaaS-y results

I don't normally spend too much time on application development topics, but CA has a bit of interesting SaaS-related news coming out of Dreamforce today that I thought was worth noting. And when an event bills itself as The Cloud Computing Event of the Year, I guess it's something I should at least mention.

Despite salesforce.com CEO Marc Benioff’s marathon keynote on Wednesday, he – and the crowds – were back for more today (albeit a bit behind schedule). That’s where CA CEO John Swainson joined him onstage to call cloud computing the “most profound change” he’s seen in computing in his 33 years in the business. They also announced a partnership between salesforce.com and CA and a nifty new CA SaaS application built on Force.com before Marc handed John a celebratory birthday bottle of Bordeaux (more on that below).

The product John showed off onstage is CA Agile Planner, a SaaS application for planning, tracking, and managing agile software development projects. Those familiar with how the agile development approach works can probably pretty quickly pick up on its usefulness in managing backlogs, sprints, and burn-downs. It can be used standalone or in concert with CA’s Clarity PPM tool (something that can give you visibility and control over all of your organization’s development projects, agile or otherwise).

One of the “big guys” begins to use Force.com, plus collaboration with salesforce.com

CA is one of the first (and I’m guessing the largest) major software vendors to develop apps on Force.com. (See, there are some big guys building apps on Force.com.) Benioff highlighted a number of other apps using Force.com onstage at Dreamforce today as well, but only Swainson went home with a bottle of 1954 Chateau d'Yquem Sauternes.

Why the wine? (Other than to say happy 55th birthday, of course.) Maybe it was to highlight a bit of teamwork that's gone on: CA developed the Agile Planner product in collaboration with salesforce.com itself. The CA product team used salesforce.com’s expertise and best practices from their own transition to agile development for the first version.

A community for feedback

That’s all good, and CA is taking that real-world influence a step further, creating something called the CA Agile Community to go with this product. It’s an on-line community of agile development practitioners who can help shape the feature roadmap for CA Agile Planner. Anyone can join, and folks can share ideas for what should be in the product, vote on things others have suggested, and have discussions with others interested in the agile approach.

The community feedback aspect will mean direct input from users and experts can have direct impact on the product. Unlike a lot of what’s been available on salesforce.com’s platform to date, this gives enterprise-level visibility, but with the easy-to-get-started attributes that come from being on the Force.com platform. A nice combo.

If you want to take a look, you can swing by the CA booth at Dreamforce or go to www.ca.com/agile. The press release on the partnership and the rest of the details of today’s announcement are here.

Now, if we could incorporate a way to vote on when Benioff should give attendees a mid-keynote bio break, I think everyone would be happy.

Wednesday, November 11, 2009

Two cloud computing Rorschach tests: 'legacy clouds' and the lock-in lesson

This week's San Francisco Cloud Computing Club gathering was a great place to meet some of the movers and shakers in the cloud computing market (or at least the ones within a short drive of San Francisco). The event's concept was to spend some quality time talking through cloud computing issues with a crowd of people who spend all day thinking about the cloud and working on making it a reality.

James Watters (now at VMware) and crew invited an intriguing crowd of about 35 or so (here’s a Twitter list of attendees). Gary Orenstein snapped some pics. John Furrier (from siliconANGLE) used some magical little device to record the entire multi-hour discussion for posterity. Hopefully the recording bit will turn out to be a good idea. There were a few pithy quotes, for sure.

Raising 2 key questions about cloud computing

As James Urquhart (CNET blogger from Cisco) guided the conversation, I heard two interesting points of contention that divided the room pretty dramatically. It struck me that each comment could actually serve as a Rorschach test of a person's view of how powerful and important the concept of cloud computing actually is.

One question was about where IT has been; the other helps define where we might be able to go. Both say a lot about the approach to cloud computing that you're likely to take.

#1: Is cloud computing compelling enough to change how IT runs existing apps?

I lobbed a question for the group to chew on that went something like this: we're hearing lots about new, pre-integrated stacks of hardware, software, storage, etc., that are intended to be used to create private clouds. These systems are interesting for new, greenfield applications. However, even if these sell like hotcakes, most of an end user organization's IT systems are going to still be using what already exists in their data centers.

The question: is it worth the pain and suffering to try to shift what's already running -- and contributing to your business -- to a private cloud? Should you create a "legacy cloud" from what already exists?

History -- and IT conventional wisdom -- say it's too hard to mess with things that work. If it ain't broke, don't fix it. The disruption of changing things adds IT operations risk, and the IT ops guys are not fans of that. Optimists, however, say that cloud computing (even if it's of the internal, private kind) is too compelling to pass up, and IT ops should make the leap.

One person in the Cloud Club discussion compared it to having to till the land to grow the wheat to make bread, versus suddenly having access to a supermarket that sells ready-made bread. You'd always choose the supermarket, right? Not so, if you're in IT operations.

I side with the optimists, but am realistic about how you'll need to get there. Customer feedback continues to say that a Big Bang approach (send everything to a cloud -- now!) certainly won't work in large companies. On the other hand, an incremental shift, app by app, is a lot more palatable to organizations with large, complex IT systems. And, in the end, there will be some apps that never move. (Last week's post mentioned John Treadway of Unisys -- he did a good presentation on a methodical, incremental approach at Cloud Computing Expo in Santa Clara.)

By the way, the answer to this question is not insignificant: it will determine how large the private cloud market will end up being. And, actually, how large the public cloud market will be as well. If end users are too timid, or unconvinced of cloud computing's benefits, this revolutionary change ends up being an uninteresting blip in the IT timeline.

However, my bet is that this process picks up steam bit by bit and is hard to stop.

#2: Has the industry learned its lesson about lock-in?

The second telling question about cloud computing I heard this week is as old as the industry itself, but with a new, cloudy twist. It's a follow-up to the previous question in many ways. Tom Bittman of Gartner and many others have talked about the private cloud market as just the opening act for cloud computing. At Cloud Club, we talked about private clouds setting the stage for the ability to move some or all of your computing to an external cloud service somewhere. Making platforms slippery enough to be able to handle the shifting back and forth of workloads led to a discussion about how exactly that can come to be.

And the thing that might keep it from coming to be: vendor lock-in.

One potential future tossed around at Cloud Club by James Urquhart, Rich Miller (of Replicate Technologies), and Tom Mornini (of Engine Yard) was that the eventual normal computing scenario will be one in which there are sets of interoperable clouds that customers will be able to choose from and move between, rapidly and easily. This Intercloud concept, as the Cisco folks call it, requires easy interoperability.

And, it requires that the industry (users and vendors alike) have learned the hard lessons about lock-in. The optimists and advanced technologists claim that we already have (Tom of Engine Yard certainly falls into that camp). If it's true, you'll have a slippery infrastructure supporting applications wherever it is that they need to run.

But skeptics didn't let that pass by without a fight. The concept seems too good to be true, and contradicts the way computing has progressed so far.

The economics of the vendors in many ways work against making it easy to move workloads around. Besides, big end user companies are used to making bets on technology stacks with only a few vendors. Is that no longer needed? Customers would certainly prefer the economics that result from making this possible. Will vendors cede control because they know lock-in is a losing proposition? Have we learned the lesson of lock-in at last?

You can see why this discussion needed comfy chairs, a moderator, and a healthy supply of beer and wine.

And the answer, please…

Obviously, there is no answer yet. There's not enough data one way or the other. The market, as we all know, is just getting going. But don't let that put you off. Why?

I think where you stand on this point is actually pretty intriguing: one position is a pragmatic read of what's come before; the other is an optimistic assessment of what's possible. This is a test of whether you see cloud computing as a fundamental shift or just a new buzzword to be exploited by those providing the old thing.

Of course, there is the possibility that cloud computing changes the game, but those who liked the old rules will continue to try to enforce them (read: making money from lock-in). The newcomers will try to undercut those rules and redefine the game. Innovator's Dilemma anyone?

So, my point in highlighting this subset of the Cloud Club conversations was this: keep an eye on these questions. They are some of the more interesting things playing out in the cloud computing space.

And, for as much fun as it was to spend several hours in good-natured arguments over these and other topics in a penthouse in San Francisco, none of the answers will appear magically, nor overnight. We leave that to time, actual customers, and the market to decide.

Friday, November 6, 2009

Fumble! What not to do at a cloud computing conference


Ah, conferences. Attending IT events is one of those traditional fall pastimes in North America and Europe. Maybe because I spend lots of time going to them, I get very particular about how they are run, especially when it comes to what speakers and organizers deliver.

The recent flurry of cloud computing conferences got me thinking that these events have their own special challenges. Since cloud computing is at the top of Gartner’s hype curve (and top of their strategic technologies list for 2010), it’s a topic that a big chunk of the IT world is looking for information about. However, recent conferences on the topic have really fumbled what could have been a much better learning experience for everyone involved.

So, instead of spending a post complaining about SYS-CON and this week's Cloud Computing Expo in Santa Clara (or other recent cloud events for that matter), I figure I'd come up with a list for organizers, speakers, and attendees alike to keep the ball moving forward on the industry conversation about cloud computing. Here are my thoughts, along with a football-season-appropriate commentary (that'll be American football, by the way) about the impact of each of these issues:

Don’t host a panel of vendors and then lob the question “So, what’s your definition of cloud computing?” If you want to have this discussion, set up a Cloud 101 or Definitions track. But be warned: asking this question doesn’t do anything except eat up the clock. If you think you’re helping clarify the topic for the audience, think again. No one is going to agree, and the people onstage are going to try to redefine it to match their biases.

A better conversation starter? "Explain how you helped a customer take a first step toward cloud computing." Or maybe: "What underlying assumptions does your product/solution/service make that customers need to know about?" Or even: "What has been the hardest thing about helping customers move to cloud computing?" But, please, don't make us sit through another round of cloud definition roulette.

[Impact: similar annoyance factor as seeing the same Bud Light commercial for the 12th time. Or hearing the ref say the phrase, "After further review, the play stands," after 20 minutes of unnecessary deliberation.]

Don’t retread well-trodden ground, and at least point us to what's new. With cloud computing evolving so fast, it's especially worth noting that each conference happens at a different point in time and should add something to the industry's understanding of where things are now and where they are going. I felt Agatha Poon of Yankee Group had a couple of interesting stats about vertical adoption of cloud computing (healthcare liked it, manufacturing was worried about technology maturity levels, and financial services was very wary of the hype). In general, however, she didn't cover much new ground. I bet she lost some of the people in the audience that she hoped to engage. The EMC speaker, too, missed the opportunity to mention the Vblock, Acadia, and VCE news -- something the audience was probably a bit curious about given the story had just broken.

[This is pretty much equivalent to hearing the ref say, "Offsetting penalties, repeat first down." Do-overs can be a bit of a drag.]

Find the customers. Encourage them to come and speak. I think the whole keynote room at Cloud Computing Expo felt embarrassed by having so few hands pop up whenever a speaker asked, "How many of you are end users?" This is the most glaring item that's been missing at cloud computing events I've been to: real-world stories. At this week's event, EMC's Mike Feinberg asked, "Who has actually developed on a cloud platform?" I saw 2 hands go up. "Who has an account on a cloud computing platform?" A few more hands went up, but not many. Yikes. By the end of the conference, speakers stopped asking the question, knowing they were speaking to a room full of vendors.

newScale's Scott Hammond gave a lively pitch (as he is so good at doing) about what one of their customers was experiencing. He recounted how the customer needed to "blow up that crazy, ugly diagram of how things are done today." What would have been even better? Having the customer there to say that themselves.

Why are end users so reluctant to show up and discuss cloud computing? Maybe it's too early. Maybe they're busy. Maybe we didn't ask in a compelling enough way to make it worth their time. Or maybe having end users attend is not a necessary part of the conference organizer's business model. Other upcoming conferences will likely do much better on this front than SYS-CON did.

[I'd call this a long throw on third down, dropped by the star receiver, just past the first-down marker. Could have been great...but wasn't. Very unsatisfying. Time to punt.]

When you do have customer speakers, make sure they are willing to be specific about their experiences. It's almost worse to think you're going to get a bunch of useful, in-the-trenches information from a customer speaker, only to listen to a bland summary of the benefits of cloud computing. Jill Tummler Singer, the deputy CIO of the CIA, spoke at Cloud Computing Expo in nothing but generalities until the Q&A session at the end of her talk. Now maybe that's because everything that the CIA is doing with their IT infrastructure is classified, but I doubt it.

The other customer general session speaker I heard was Tony Langenstein from Iowa Health Systems. He recounted the urgent chaos around the June 2008 floods in Iowa that put an IHS data center out of commission and what he and his team did to deal with it. It was a great disaster recovery story, complete with hour-by-hour accounts, descriptions of water gushing through every nook and cranny of his data center, and before-and-after photos. The only problem? When it came to discussing the cloud computing angle (related to his use of storage) as opposed to just the data center DR issues, he wasn't specific at all. All he really said was that it "just worked." Entertaining (especially the "after" photos of fish in odd places), but not as useful as it could have been.

[This is important to get right. Not doing so is kind of like using your first-round draft pick to grab a potential franchise quarterback. And then benching him in favor of the journeyman back-up QB.]

Expert speakers can be from vendors. Just be sure to clearly delineate sales pitches from analytical, use case, or strategic content. Make sure vendors know why they are speaking -- and how to keep their selling instincts bottled up. If you're a vendor creating a pitch, it's easy to tell the difference: if you have slides with product names, it's a sales pitch. If there are slides about what customers are doing, it's probably a pretty good start.

Since a good chunk (if not all) of the presenters at Cloud Computing Expo paid for their slots, it's natural that you get a tendency toward product sales pitches. Unfortunately, that's not what the cloud computing discussion needs right now. While SYS-CON was probably able to extract some good revenue from the conference, pervasive sales pitches make it a lot harder for attendees to find the good content.

Nevertheless, there were a few very engaging, intriguing, and useful sessions that I attended (and probably several more that I missed). Yahoo!'s Dr. Raghu Ramakrishnan led what I heard Jake Kaldenbaugh (@Jakewk) call a "large-scale cloud data management master class" in his Tuesday general session. Dr. Ramakrishnan showed what putting data front and center actually means in the cloud. I'm betting Surendra Reddy's presentation (@sureddy from Yahoo!) was also pretty refreshing (though I missed that one). I thought CA's own Stephen Elliot wove in quite a bit of useful and relevant experience on both virtualization and cloud. John Treadway (@cloudbzz) from Unisys and Peter Nickolov from 3Tera both did a very good job of directing business-level discussions without aggressively pitching their wares (even when pushed to do so).

[The vendor sales pitch thing reminds me of the annoyance of sitting through TV timeouts -- while you are in the stadium where the game's being played. You can't speed it up and it's exasperating to sit through.]

Vendor demos can be useful in showing what is meant. There's definitely a place at conferences for vendor-specific content, including demos. When there's someone taking a new approach or delivering a new offering, an overview of a vendor's solution is actually interesting and useful.

Cloud Computing Expo had a few like this that were worth attending. AppZero did a thrill-a-minute demo (that's sarcasm) of its Virtual Application Appliance technology that nicely explained its potential value. (To be fair, my sarcasm isn't aimed at what AppZero actually showed; it worked as advertised. It's just that, as I learned at Cassatt and BEA, infrastructure demos are tough to make exciting.) CloudSwitch explained (and quasi-demoed) a tricky bit of software indirection: their way of using a public cloud without letting your systems know that they are actually using a public cloud.

[This is kind of like a flea-flicker play. Seeing a good demo of new stuff can be pretty snazzy -- and can go for big gains. If it doesn't work, though, it does seem kind of foolish.]

Enough pontificating already. Conferences should help (somehow) capture practical advice. I heard some of the (few) customers who were attending ask a couple types of questions. First, usually of particular vendors: how does something they had been describing actually work? Sure, it's a product question, but it shows an interest in getting to specifics. The other question I heard was more general: OK, so how do I pitch these approaches we're hearing about to my organization back home? Or (even more pointedly), assuming I buy into all this cloud computing stuff, where do I start?

[This is like the importance of having a good running game when it's fourth and goal. You want to be able to get the job done in a way that's simple, straightforward, and effective. Done right, this (and said conference) is a touchdown. Done badly, everything else is kind of pointless.]

Extracting value

These last few practical questions from end users are what can make a few days out of the office worth it. Few conferences do that. All said, and despite how negative I sound here, the Cloud Computing Expo event was better than I expected. I even met a few Twitter followers/followees. In addition, I missed the Cloud Camp that was held the final evening -- certainly a totally different style of event that might even address some of the comments above.

In any case, extracting value from cloud computing events shouldn't be so hard. And if we're going to make progress on cloud computing, we've got to do better. Or at least find a better coach.

Note: For my "as-it-happened" tweets and commentary on Cloud Computing Expo in Santa Clara, take a look back at www.twitter.com/jayfry3 from Mon., Nov. 2 through Wed., Nov. 4.