Monday, December 28, 2009

PG&E’s Bramfitt: data centers can be efficient and sustainable, but we must think beyond our current horizons

Before heading off for a little vacation, I posted the first part of a two-part interview with Pacific Gas & Electric’s Mark Bramfitt. Mark is best known in data center circles (you know, the kind of people who hang around this site and others like it) as the face of the Bay Area utility’s efforts to drive data center energy efficiency through innovative incentive programs. Now that I’m back, I’m posting part 2 of that discussion.

In the first part of the interview, Mark talked about what’s worked well and what hasn’t in PG&E's data center energy efficiency efforts, the impact of the recession on green IT projects, and how cloud computing is impacting the story.

After the interview appeared, Forrester analyst Doug Washburn wondered on Twitter if PG&E might kill the incentive programs Mark had previously been guiding. The thought is a natural one given how integral Mark has been to the program. From my interview, it didn’t sound so far-fetched: Mark himself thought cost pressures might cause PG&E to “de-emphasize…some of the industry leadership activities undertaken by PG&E…we may not be able to afford to develop new programs and services if they won’t deliver savings.”

In the final part of the interview, posted below, Mark gets a little philosophical about energy efficiency metrics, highlights what’s really holding back data center efficiency work, and doles out a final bit of advice about creating "inherently efficient" IT infrastructure as he departs PG&E.

Jay Fry, Data Center Dialog: Mark, we’ve talked previously about the fact that a lot of the data center efficiency problem comes down to split incentives – often IT doesn’t pay or even see the power bill and the facilities guy doesn’t have control over the servers that are creating the problem. In addition, facilities is usually disconnected from the business reason for running those servers. How serious is this problem right now? What effective or creative ways have you seen for dealing with it?

Mark Bramfitt, PG&E: While I think we can all cite the split incentive issue as a problem, it’s not the only obstacle to success and it gets perhaps a bit too much air time. As much as I’d like to say that everyone’s focus on energy efficiency is making the split incentive obstacle moot, our experience is that there are only two ways that we get past that.

The first is when the IT management side simply has to get more done with less spend, leading them to think about virtualizing workloads. The second is when a facility runs out of capacity, throwing both IT and facility managers into the same boat, with energy efficiency measures often saving the day.

DCD: Many of the industry analyst groups (the 451 Group, Forrester, Gartner, etc.) have eco-efficient IT practices or focus areas now, a definite change from a few years ago. Is this a sign of progress, and how can industry analyst groups be helpful in this process?


Mark Bramfitt: I can’t argue that the focus on energy-efficient IT is anything but a good thing, but I’ve identified a big resource gap that hampers the ability of utilities to drive efficiency programs. We rely on engineering consultants who can accurately calculate the energy savings from implementing facility and IT improvements, and even here in the Bay Area, we have a hard time finding firms with this competency – especially on the IT equipment side.

In my discussions with utilities across the U.S., this is probably the single biggest barrier to program adoption – they can’t find firms who can do the calculations, or resources to appropriately evaluate and review them.

So, I think there’s a real opportunity for IT service providers – companies that sell both solutions and services – to step into this market. I don’t know that analyst firms can do much about that.

DCD: I think it’s fair to say that measurement of the power and ecological impact of data centers has started and really does matter. In the (pre-CA) survey Cassatt did at the beginning of this year on the topic, we found that there were definitely areas of progress, but that old data center habits die hard. Plus, arguments among the various industry and governmental groups like the Green Grid and Energy Star over how and what to measure (PUE, DCiE, EUE) probably don’t help. How do you think people should approach measurement?


Mark Bramfitt: We’ve delivered a consistent message to both customers and to vendors of metering and monitoring systems that we think quantifying energy use can only have a positive impact on driving people to manage their data centers better. Metering and monitoring systems lead people to make simple changes, and can directly measure energy savings in support of utility incentive programs.

We also like that some systems are moving beyond just measurement into control of facility and IT equipment, and to the extent that they can do so, we can provide incentive funding to support implementation.

The philosophical distinctions being made around what metrics are best are understandable – the ones we have now place an emphasis on driving down the “overhead” energy use of cooling and power conditioning equipment, and have nothing to say about IT power load. I believe that the industry should focus on an IT equipment “utilization index” rather than holding out for the ideal efficiency metric, which is probably not conceivable given all of the different IT workloads extant in the marketplace.
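
For readers who haven’t run these numbers: PUE is simply total facility power divided by IT equipment power, and DCiE is its reciprocal expressed as a percentage – both reward trimming cooling and power-conditioning overhead while saying nothing about whether the servers are busy. Here’s a minimal sketch in Python of the contrast Mark is drawing; the utilization index calculation is my own illustrative assumption, not a published standard.

```python
# PUE and DCiE follow their standard definitions; the utilization
# index is a hypothetical illustration of the kind of IT-side metric
# Mark describes, not an industry-standard formula.

def pue(total_facility_kw, it_equipment_kw):
    """Power Usage Effectiveness: total facility power / IT power."""
    return total_facility_kw / it_equipment_kw

def dcie(total_facility_kw, it_equipment_kw):
    """Data Center infrastructure Efficiency: reciprocal of PUE, as a %."""
    return 100.0 * it_equipment_kw / total_facility_kw

def utilization_index(server_utilizations):
    """Hypothetical 'utilization index': average fraction of server
    capacity doing useful work."""
    return sum(server_utilizations) / len(server_utilizations)

# Example: a 1,500 kW facility feeding 1,000 kW of IT load...
print(pue(1500.0, 1000.0))    # 1.5 -- a respectable facility number
print(dcie(1500.0, 1000.0))   # ~66.7%
# ...whose servers mostly idle at 5-15% utilization.
print(utilization_index([0.05, 0.10, 0.15, 0.10, 0.05, 0.12]))  # ~0.10
```

A facility can post an admirable PUE while that third number stays dismal, which is exactly why an IT-side index matters.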

DCD: One of the major initiatives you had was working with other utilities to create a coalition pushing for incentives similar to those PG&E has been offering. I’ve heard Data Center Pulse is helping work on that, too. How would you characterize where we are with this process currently? What’s next?

Mark Bramfitt: The PG&E-sponsored and -led Utility IT Energy Efficiency Coalition [also mentioned in part 1 of Mark’s interview] now has almost 50 members, and I would say it has been a success in its core mission – to provide a forum for utilities to share program models and to discuss opportunities in this market space.

I think it’s time for the Coalition, in whatever form it takes in the future, to expand to offering the IT industry a view of which utilities are offering programs and how to engage with them, as well as a place for IT and other companies to list their competencies.

I’ll be frank, though, in saying that I don’t know whether PG&E will continue to lead this effort, or if we need to think about another way to accomplish this work. I’ve had conversations with a lot of players to see how to maintain the existing effort as well as to extend it into new areas.

DCD: Watching all of this from PG&E is certainly different from the perspective of vendors, IT departments, or facilities folks. Any comments on seeing this process from the vantage point you’ve had?

Mark Bramfitt: Utilities primarily look at the data center market as a load growth challenge: how can we provide new energy delivery capacity to a segment that is projected to double every 5 years? There’s already immense national and global competition for locations where 20 or even 100 MW loads can be accommodated, and where customers want to build out facilities in months, not years.

PG&E’s answer to that is to actively work with customers to improve energy efficiency, extending our ability to accommodate new load growth without resorting to energy supply and delivery capacity that is expensive to build and not our best choice from an environmental sustainability perspective.

My larger view is that IT can deliver tremendous environmental benefits as it affects broad swaths of our activities – improving delivery systems, for example, and in my line of work enabling “smart grid” technologies that can improve utility operation and efficiency. But to get there, we need to have IT infrastructure that is inherently efficient and thereby sustainable, and we have a huge opportunity space to make that happen.

DCD: Do you have any advice you’d care to leave everyone with?

Mark Bramfitt: Advice? Every major achievement I’ve seen in this space has been due to people expanding their vision and horizons. It’s IT managers taking responsibility for energy costs even if those costs don’t roll up in their budget. It’s IT companies supporting efficiency measures that might in some ways be at cross-purposes with their primary business objectives. And it’s utilities that know their mission can’t just be about delivering energy; they need to support customers and communities in new ways.



Thanks, Mark, for taking the time for the interview, and best of luck with the new venture.

Wednesday, December 16, 2009

As Bramfitt departs PG&E, where will the new focus for data center energy efficiency efforts be?

If you were in the audience at the Silicon Valley Leadership Group’s Data Center Energy Efficiency Summit earlier this year, you were probably there (among other things) to hear Mark Bramfitt from Pacific Gas & Electric (PG&E). Mark has been the key figure in the Bay Area utility’s efforts to drive improvements in how data centers are run to cut energy costs for the data center owners and to reduce their burgeoning demand for power.

But, Mark had a surprise for his audience. During his presentation, he announced he was leaving PG&E.

“The reaction from the crowd was impressive…and for good reason,” said John Sheputis, CEO of Fortune Data Centers, in a story by Matt Stansberry at SearchDataCenter. “Mark is an excellent speaker, a very well-known entity in the Valley, and among the most outspoken people I know of regarding the broader engagement opportunities between data centers and electricity providers,” Sheputis said. “No one has done more to fund efficiency programs and award high tech consumers for efficient behavior.”

Mark has stayed on with PG&E for the months since then to help with the transition. Before he moves on to his Next Big Thing at the end of 2009, I thought I’d ask him for his thoughts on a few relevant topics. In the first part of the interview that I’m posting today, Mark talks about what’s worked well and what hasn’t in PG&E's data center energy efficiency efforts, the impact of the recession on green IT projects, and how cloud computing is impacting the story.

Jay Fry, Data Center Dialog: Mark, you’ve become synonymous with PG&E’s data center efficiency work, and if the reaction at the SVLG event to the announcement that you’ll be leaving that role is any indication, you’ll be missed. Can you give some perspective on how things have changed in this area during your time on the project at PG&E?

Mark Bramfitt, PG&E: First, I truly appreciate the opportunity to offer some clarity around PG&E’s continued focus on this market, as well as my own plans, as I feel I’ve done something of a disservice to both the IT and utility industry nexus by not being more forthright regarding our plans.

My team and I have treated the data center and IT energy efficiency market as a start-up within PG&E’s much larger program portfolio, and we’ve seen a great growth curve over the past four years – doubling our accomplishments in 2008 compared to 2007, for example. We’ve built an industry-leading portfolio of programs and services, and I expect PG&E will continue to see great engagement from our customers in this space.

That being said, utilities in California are under tremendous pressure to deliver energy efficiency as cost effectively as possible, so some of the industry leadership activities undertaken by PG&E may have to be de-emphasized, and we may not be able to afford to develop new programs and services if they won’t deliver savings.

My personal goal is to see 20 or more utilities follow PG&E’s lead by offering comprehensive efficiency programs for data centers and IT, and I think I can best achieve that through targeted consulting support. I’ve been supporting utilities essentially in my “spare” time, in part through the Utility IT Energy Efficiency Coalition, but there are significant challenges to address in the industry, and I think my full-time focus as a consultant will lead to broader success.

DCD: Why are you leaving PG&E, why now, and what will you be working on?

Mark Bramfitt: It may sound trite, or worse, arrogant, but I want to amplify the accomplishments we’ve made at PG&E over the past few years, using my knowledge and skills to drive better engagement between the utility and IT industries in the coming years. PG&E now has a mature program model that can be executed well in Northern California, so I’d like to spend my time on the bigger challenge of driving nationwide activities that will hopefully yield big results.

DCD: You had some big wins at some big Silicon Valley data centers: NetApp being one of those. Can you talk about what came together to make some of those possible? What should other organizations focus on to get them closer to being able to improve their data center efficiency as well?

Mark Bramfitt: Our “big hit” projects have all been new construction engagements where PG&E provides financial incentives to help pay for the incremental costs of energy efficiency improvements – for measures like air- or water-side economizers, premium efficiency power conditioning and delivery equipment, and air flow isolation measures.

We certainly think our financial support is a big factor in making these projects work, but my project managers will tell you that the commitment of the project developer/owner is key. The design team has to want to work with the technical resource team PG&E brings to the table, and be open to spending more capital to realize expense savings down the road.

DCD: You made some comments onstage at the Gartner Data Center Conference last year, saying that “It’s been slow going.” Why do you think that’s been, and what was most disappointing to you about this effort?

Mark Bramfitt: I don’t have any real disappointments with how things have gone – we’re just very focused on being as successful as we can possibly be, and we are introspective in thinking about what we could do better.

I’d characterize it this way: we’ve designed ways to support on the order of 25 energy efficiency technologies and measures, absolutely leading the utility industry. We’ve reached out to dozens of VARs and system integrators, all of the major IT firms, every industry group and customer association, made hundreds of presentations, delivered free training and education programs, the list goes on.

What has slowed us down, I think, is that the IT industry and IT managers had essentially no experience with utility efficiency programs three years ago. It simply has taken us far longer than we anticipated to get the utility partnership message out there to the IT community.

DCD: The green IT hype was pretty impressive in late 2007 and early 2008. Then the economic crisis really hit. How has the economic downturn affected interest in energy efficiency projects? Did it get lost in the crisis? My personal take is that it certainly didn’t get as much attention as it would have otherwise. Maybe the recession caused companies to be more practical about energy efficiency topics, but I’m not sure about that. What are your thoughts?

Mark Bramfitt: I don’t see that the message has been lost, but certainly the economy has affected market activity.

PG&E is not seeing the level of new data center construction that we had in ’07 and ’08, but the collocation community tells me demand is exceeding supply by 3-to-1. They just can’t get financing to build new facilities.

On the retrofit side, we’re seeing interest in air flow management measures as the hot spot, perhaps because customers are getting the message that the returns are great, and it is an easy way to extend the life and capacity of existing facilities.

DCD: The other topic that’s taken a big share of the IT and facilities spotlight in the last year has obviously been cloud computing. How do you see the efficiency and cloud computing conversations playing together? Is the cloud discussion helping or hindering the efficiency discussion inside organizations in your opinion?

Mark Bramfitt: I’ve talked to some thought leaders on cloud computing and many seem to think that highlighting the potential energy efficiency advantages of shared services has merit. But with regard to our program delivery, the intersection has really been about how to serve collocation owners and tenants, rather than on the broader topic of migration to cloud services.



Be sure to come back for Part 2 of the interview. We'll cover a few other topics of note, including Mark’s thoughts on the philosophical differences over measurement approaches, the single biggest barrier to data center efficiency program adoption, and even a little bit of parting advice from Mark as he gets ready to leave PG&E.

Monday, December 14, 2009

IT metrics: a snore? Not when cloud computing is involved

Amazon’s recent announcement about the availability of spot pricing for EC2 instances got me thinking a bit about the role measurement is going to play in cloud computing. In fact, I sat in on a few sessions at the Gartner Data Center Conference earlier this month focused exclusively on metrics.

A snore? Actually not.

The increasingly dynamic world of cloud computing is going to require (in fact, completely rely upon) a robust and trustworthy set of metrics for how things are going. So much so, in fact, that I’d expect a bit of a renaissance in this space.


And while Amazon’s market-leading moves point to all sorts of requirements to measure costs, availability, and other aspects of its EC2 offering, many IT shops are thinking much more locally: how to get going on their own private compute clouds. IT metrics & measurement are going to get more and more important for those IT shops, too. Getting the right metrics working in a way that helps you make IT decisions is a really important step. Sometimes you’ll be impressed with all sorts of useful data about your infrastructure. Just as often, you won’t like what those metrics report back.
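
To see why trustworthy metrics stop being optional in this world, consider the spot model itself: you set a maximum price you’re willing to pay, and your instances run only while the market price stays under it. Here’s a rough sketch of the bookkeeping that implies – all of the prices are invented, and this doesn’t call the real EC2 API.

```python
# Hypothetical numbers throughout; this illustrates the bookkeeping
# spot pricing requires, not real EC2 data or APIs.

ON_DEMAND_PRICE = 0.34   # $/hour, assumed on-demand rate
MAX_BID = 0.15           # $/hour, our maximum spot bid

# Invented hourly spot-market prices for an eight-hour window.
spot_prices = [0.09, 0.11, 0.14, 0.16, 0.12, 0.10, 0.17, 0.13]

# The instance runs only in hours where the market price <= our bid.
hours_run = [p for p in spot_prices if p <= MAX_BID]
spot_cost = sum(hours_run)
on_demand_cost = ON_DEMAND_PRICE * len(spot_prices)

print(f"Ran {len(hours_run)} of {len(spot_prices)} hours on spot")
print(f"Spot cost: ${spot_cost:.2f} vs. on-demand: ${on_demand_cost:.2f}")
```

Without per-hour metering you can’t even answer “what did that workload cost?”, let alone decide whether spot capacity is worth the interruptions.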

Here are a few additional thoughts about using metrics to help manage IT, using the Gartner metrics sessions from Vegas as a jumping off point:

An interesting conundrum for metrics: will they help align IT & business – or make things worse?

Gartner analyst Milind Govekar spent an entire session at Gartner’s conference (“Improving Your IT Business through Metrics”) talking about ways to manage IT, and ways to use metrics to that end. Not that this is new. In fact, one of the sessions that Gartner seems to run every year is something about how misaligned IT and the business are…or how best to get them aligned…or how they are moving closer. But I heard a few interesting wrinkles this time around on how measurement should/could be used:

· Without a certain level of IT operational maturity, investment in infrastructure – and measurement of that infrastructure – is not going to yield great results. Govekar pointed out that the majority of the customers they talk to are about halfway through a journey toward more dynamic, adaptable IT. And that’s about the point that the organizations can see a measurable pay-off from the investments. “That’s where you [the IT department] start to make a difference as a trusted service provider,” Govekar said. So, that also means you should expect the journey up to that point to have some bumps. Several analysts noted a need to improve process maturity to make true IT operations progress. And, of course, Gartner conveniently gives you a way to measure where your particular shop is on that journey.

· Metrics can and should be important, but be careful. Choose wisely. Govekar put it very succinctly in his session when he said, “Be careful with metrics: they drive people to do good things; they drive people to do bad things.” As we (hopefully) learned from Wall Street last year, people optimize around what they are measured and compensated for. Make sure it’s the right thing.

· Organizational patience is key. Govekar listed senior management commitment, persistence, and patience as very important to efforts to use metrics to drive IT change. An organization without an interest in continual improvement is not likely to be able to grapple with what the metrics will reveal – and the cultural changes that are necessary to get to that more dynamic IT they think they want.

· You have to be working at a level that business people will care about in order to have any chance of getting the ear of the CEO. The Gartner conference keynote from Donna Scott and Ronni Colville focused on this practical truth: if you can’t talk to your business folks about ways that IT can help them improve what they’re trying to do, you’re going to have a difficult path to getting your job done. And, your CIO is not likely to be seen as a full partner in the business. While “technology may be used to help innovation,” said Scott, “people and processes are the backbone of innovation, not technology.” Good point. All those virtual machines may sound cool, but not necessarily to everyone. And whatever stats, facts, and figures you do end up with, don’t expect them to be as meaningful to the business set as they might be to the IT folks.

In one of the more interesting sessions about metrics, however, Gartner began to paint a picture of some of the innovative things you can do with them.

Gartner analyst Joseph Baylock had some intriguing thoughts in his keynote about ways to use patterns from your infrastructure & operations data to improve IT’s sensitivity to the business. And, if the fancy brochures I found in the lobby are any indication, it sounds like Gartner is going to be spending much more time and effort thinking this through.

Baylock talked about creating an IT operations model that seeks out patterns, applies them to models, and then adapts accordingly. What could this mean for IT? Probably nothing quite yet. However, for companies with performance-driven cultures, Gartner saw this as a way to move from using lagging indicators for managing IT (and, hence, your business) to using leading indicators.
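
To make the lagging-versus-leading distinction concrete, here’s a toy sketch – entirely my own illustration, not anything Gartner presented – that fits a simple trend to utilization samples and flags a capacity problem before it arrives instead of reporting it after the fact:

```python
# Toy leading indicator: extrapolate a utilization trend to estimate
# when capacity runs out. The data points are invented.

def linear_trend(samples):
    """Least-squares slope and intercept for evenly spaced samples."""
    n = len(samples)
    mean_x = (n - 1) / 2.0
    mean_y = sum(samples) / n
    slope = sum((x - mean_x) * (y - mean_y)
                for x, y in enumerate(samples))
    slope /= sum((x - mean_x) ** 2 for x in range(n))
    return slope, mean_y - slope * mean_x

# Weekly storage utilization, as a fraction of total capacity.
utilization = [0.62, 0.64, 0.67, 0.69, 0.72, 0.74]

slope, intercept = linear_trend(utilization)
# Weeks from now until the trend line crosses 100% utilization.
weeks_left = (1.0 - intercept) / slope - (len(utilization) - 1)
print(f"At this trend, capacity is exhausted in ~{weeks_left:.0f} weeks")
```

A report that says “the array filled up last Tuesday” is a lagging indicator; the projection above, crude as it is, is a leading one.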

So, not only will you want to be able to measure what you're using at any given time (from Amazon or whomever else externally – and from your own internal infrastructure), but perhaps you’ll want to start to get smart about sifting through the mounds of data to make good choices.

Sounds to me like it’s worth keeping an eye (or two) open – and on this topic – after all.

Tuesday, December 8, 2009

Cloud, the economy, and short-term thinking: highlights from Gartner's Data Center Conference

There was a lot to take in at the Gartner Data Center Conference this year, and unfortunately I had a very abbreviated amount of time actually on the ground in Vegas to do it. What was the most notable thing? In my opinion, the headline was how much Gartner has strengthened its feelings about cloud computing in 12 months.

And not just any 12 months, mind you, but 12 months darkened by the worst economic downturn of our lives. Of course, they did talk plenty about the impact the economy has had, especially on the IT operations psyche and approach, but cloud computing won out, hands down.

From what I gathered, Gartner now sees cloud computing (especially private clouds) and the real-time infrastructures that support them as something big customers should take a look at.

This certainly isn’t an earth-shattering change. Last year at this time, Tom Bittman’s keynote was about how the future of IT looked a lot like a private cloud. This year, the keynotes and sessions throughout the week definitely gave the impression that cloud computing has evolved and improved this past year, and Gartner has moved its thinking as well. The cloud computing discussion felt more mainstream, more central to many of the presentations. It certainly played a starring role in more of them.

Private clouds received quite a bit of positive airtime, and a near-unanimous vote of confidence from Gartner analysts – something that I didn’t feel was true at last year’s event. This year, Bittman was one of a larger chorus, rather than one of the few vocal Gartner private cloud advocates.

Dave Cappuccio’s keynote on “Infrastructure & Operations: Charting a Course for the Coming Decade” emphasized Gartner’s enthusiasm around private clouds with this prediction: of the money being spent on cloud computing over the next 3-5 years, “we think private cloud services will be 70-80% of the investment.” That sounds pretty serious to me.

Cloud computing focus soars, even though budgets don’t

The audience in the room for the opening keynotes (polled, as usual, with handheld devices scattered throughout the room) considered cloud computing to be the 3rd biggest data center challenge for 2010 – in a world in which they also expect flat IT operations budgets. The No. 1 challenge was “space, power, & cooling” and No. 2 was that perennial issue, “linking IT & business.” In fact, 6% of people in the room put cloud computing as their top funded item for next year (compared with 9% for virtualization). That’s impressive.

Gartner believes the bad economy is forcing short-term thinking on infrastructure & operations projects

But don’t think that just because cloud computing progressed, there’s nothing dramatically different from last year. There is: the impact of the economy on the behavior of IT operations.

One of the most depressing quotes of the conference was not from Jeffrey Garten of the Yale School of Management (though I thought it might be). He actually gave a pretty interesting talk – in plain English – about what has gone well and what has gone badly in the attempts, which went into high gear at the end of 2008, to stave off a full global financial meltdown.

Instead, it was how Donna Scott and Ronni Colville characterized IT’s response.

Is the best offense a good defense? Really?

Because of the bad economy, according to Scott and Colville, any talk about data center or infrastructure “transformation” is right out the window. Instead, they said, the kinder, gentler term “restructuring” is in vogue. I guess that descriptor implies short-term ROI, which is at least pragmatic, since projects with long-term pay-outs are off the table completely. Organizations, said Scott, are instead switching to fast (OK, rushed) IT restructuring moves now.

So, hooray for the focus on ROI, but the shorter time horizon isn’t necessarily good news. It means less thoughtful analysis and planning, and a focus only on what can be done immediately. The unfortunate result? “A lot of optimization decisions are made in silos,” Scott said, when in fact sharing those efforts (things like moving to shared IT services), could have a much broader impact. Gee, thanks, bad economy.

In fact, many of the comments I heard from analysts gave the distinct impression that IT is (and should be) on the defensive. “Promises of IT transformation,” said Milind Govekar during his talk on IT metrics, “are taking a back seat to immediate goals of cost optimization, increasing quality or reducing risk.” Govekar noted that, in fact, “cloud hype” is a part of the problem; it “threatens IT budgets and control.”

Comments like these from Govekar and others felt a bit like the analysts were hearing (and in some cases recommending) that IT should circle the wagons, implying that cloud computing was threatening the status quo in IT. Well, if all of its promises come true, it certainly could.

But not if IT operations makes the first move. How? I liked the bit of advice I saw come from Carl Claunch’s keynote: pilot everything. Private clouds, public clouds, hybrid clouds. If you want to know how to do it, the only way to learn is to get started.

For the record, I seem to recall recounting some very similar advice from last year’s conference. Nevertheless, from what I saw and heard this year, Gartner’s research and the evidence from attendees show increased interest in and adoption of cloud computing and real-time infrastructures, despite hiccups, bumps, and bruises from the Great Recession.

That in and of itself is impressive.