I caved and joined the revolution this weekend. I bought an iPad.
And while it was very fun to do all the things that a newly minted Apple fanboy does (like downloading the app that turns the iPad into one of the devices they had on Star Trek: The Next Generation), that was just the beginning. I hadn't yet tried to torment my internal IT department with my demands.
First and foremost: I wanted to use my iPad as part of my normal work day. I'm certainly not the first to want this. The appearance of consumer-purchased devices that employees would like to have (must be!) supported by internal IT is getting an amazing amount of attention -- though not always from IT departments, if they can help it. And it's not just one or two folks who want to start using tablets, smartphones, and the like. It's everyone.
What does “not supported” mean for my iPad?
So, first thing Monday, I tried my luck linking into our IT systems. It started off innocently enough: I easily connected to the in-office wireless network. The first real test was going to be whether I could get my corporate email and calendar.
IT had obviously been through this before; there is a document in place on our help system that explains how to do everything. Unfortunately, it starts like this: "Please check if this iPad was purchased for business purposes or if it was a personal purchase. Note: personal machines are not supported."
Hmmm. That sounded ominous. But, despite being “not supported,” it was really simple to enable email and calendar access. I had to add some security precautions, as you might expect, but it worked. My fingers are crossed that it continues to work, given the help I’m not going to get. And, of course, there are a multitude of our enterprise apps I’m not getting access to.
But I’m satisfied. For now. But not everyone is. And corporations certainly shouldn’t be.
Cloud computing, intriguing mobile devices (and what you can do with them) are ganging up on IT
My process of tormenting IT with my iPad started Monday, but it’s guaranteed to last for a long time. And, as I said, the key issue is that I’m not alone.
People – and, yes, it’s about the people and what they (er, I) want to do – have devices that they love that give them easy, constant access. That should be good. There’s a blurring of the boundary between business and personal that businesses stand to gain from.
Cloud-based resources give organizations a fighting chance to scale sufficiently to keep up with the volume driven by these more-and-more-ubiquitous tablets and smartphones. But management and security are often thought of way too late.
In a piece posted at Forbes.com, Dan Woods, CTO and editor of CITO Research, noted that “the IT monopoly has ended but the need to ensure security, reliability, compliance, and integration has not. Most consumerization efforts are long on choice and short on ways to actually address the fact that IT’s responsibilities to manage the issues have not disappeared.”
Shirking management and security – or leaving them as an afterthought – will not cut it this time around, especially since users don’t think twice about going around official IT channels. And that’s something those official channels really can’t afford to have happen if they’re going to get their jobs done.
The train is moving faster than you thought
In a study called “IT Consumers Transform the Enterprise: Are You Ready?” published a few weeks back (free copy without registration here; CA Technologies was a sponsor), IDC calls out these needs – and the speed at which they need to be dealt with: “The train is moving faster than you thought. Adoption of public cloud, mobile, and social technologies in business operations has already reached high levels, often driven by ‘stealth IT.’”
IDC noted a “surprisingly high” (and concerning) level of personal and confidential information sharing. While the “consumerization of IT” introduces a bunch of new, innovative services and approaches into the enterprise, it also exposes the org to “business risk, compliance gaps, and security challenges if they are not managed.”
An InfoWorld article by Nancy Gohring noted another IDC study that found that even as more and more people are bringing their own tablets and smartphones to work, IT departments have been “slow to support them and may not even be aware of the trend.” Slow, I understand (given I just bought my first iPad a few days ago); not aware, however, is a recipe for big headaches ahead.
What are those ahead of the train doing to stay ahead?
Not everyone, however, is behind the curve. Part of the IDC survey I mentioned earlier highlighted the top characteristics of leaders in this area – as in, what behaviors they’re exhibiting. The leaders are more likely to be those using IaaS, PaaS, and SaaS; those who are interacting with customers using their smart mobile devices; those who are concerned about data protection, back-up, and end-to-end user experience. “Businesses that are being proactive about consumer-driven IT are more likely to realize greater benefits from investments made to address the consumerization of IT,” said IDC’s Liam Lahey in a recent blog that summarized their survey findings.
In addition, in Woods’ Forbes article, he pointed out some questions that need asking, many at an application level: “Supporting mobile workers adds a new dimension to every application in a company. Which applications should be supported on mobile devices? How much of each application should be available? When does it make sense to craft custom mobile solutions? How can consumer apps become part of the picture? What is [the] ROI for mobility? How much should be invested[?] None of these questions have obvious answers.” Another post of his has some good suggested approaches for IT.
My CA Technologies colleague Andi Mann did a good job of netting this all out in another recent post: “While a minority of leading organizations already ‘get it’, there is still a massive latent opportunity to establish new game-changing technologies, drive disruptive innovations, build exponential revenues, and beat your competitors.” In other words, having IT bury its head in the sand is going to mean missing some opportunities that don’t come along very often to reshape the competitive landscape.
Especially when you couple the support of these tablets and other mobile devices with the changes coming about with the rise of cloud computing.
Look in the mirror
In the end, says Andi, “it’s all about you! ...The bottom line is that you — as an individual, as a consumer, as an employee, as an IT professional — are responsible for a radical change affecting business, government, and IT. You are both driving this change as a consumer of social, mobile, and cloud applications; and being driven by this change as an IT professional adapting to these new customer behaviors.”
Maybe TIME Magazine wasn’t wrong a few years back when they named You as their Person of the Year (congrats, by the way) with a big mirror-like thing on their front cover. It’s just that the revolution always takes longer than people think, and the results are never quite evenly distributed.
I’m a perfect example. I've been involved in cloud computing for many years, but didn’t join this particular part of the revolution -- the part where I expect flicking my fingers on a piece of glass will get me access to what I want -- until this past weekend.
But I’ll probably be confounding IT departments left and right from now on. Make it so.
Wednesday, August 3, 2011
Boy, my new iPad and I are demanding a lot from IT -- and we're not alone
Posted by Jay Fry at 4:26 PM
Friday, March 13, 2009
Like the Big Dig, ex-IDC analyst John Humphreys believes cloud computing will 'take time'
Posted by Jay Fry at 6:05 AM
In the last post, I interviewed John Humphreys, formerly the resident virtualization guru at IDC, now with the virtualization and management group within Citrix. John characterized Citrix as moving beyond criticism that they aren't doing enough with their XenSource acquisition and, in fact, taking the bull by the horns -- offering XenServer for free and focusing on aspects of heterogeneous management.
That first crack at being able to manage diverse types of virtualization in the data center is certainly needed. It's one of the first steps down a path that hasn't been well-trodden so far (and it has been especially ignored by the virtualization vendors themselves). OK, but how might that all fit into the industry conversation around cloud computing? Glad you asked...
Jay Fry, Data Center Dialog: John, I asked your old buddy Al Gillen at IDC how he thought virtualization connected to (and was distinct from) cloud computing. I'd love to get your thoughts on that, too.
John Humphreys, Citrix: I see virtualization as the foundation to any "cloudy" infrastructure -- public or private. In order for any cloud to live up to the promises, it must be able to deliver services that are isolated and abstracted from the underlying infrastructure. Clearly virtualization delivers on both of those requirements in spades!
In my opinion, the opportunity is to build workflow and policies on top of the virtualized infrastructure. Today those workflows are built at the siloed application or VM level, but I believe it will require policies and workflows that exist at the IT service level. To me, execution on this sort of vision will take a long-term, multi-year commitment.
DCD: The big virtualization vendors -- Citrix, VMware, and Microsoft -- have all also talked about their cloud computing visions. While I like that VMware talks about both internal and external clouds, they seem to think that everyone will be virtualizing 100% of their servers no questions asked, and that no other virtualization technologies will exist in a data center. That, to me, puts them (and their vision) out of synch with the reality of a customer's actual data center. What's your take on this?
John Humphreys: First and foremost, I agree -- I simply don't see customers going 100% virtual any time soon. There are too many "cultural barriers" or concerns in place to do that.
The second point I'd make is that to me, cloud is still years away from becoming a mainstream reality. Just as a point of comparison, x86 virtualization, measured by VMware, is over 10 years old and now approximately 20% of servers are being virtualized each year. These things take time.
Finally, I'd point out that in the near- to mid-term, the ability to federate between internal and external clouds is a huge hurdle for the industry.
Concepts like "cloudbursting" are appealing but today are architecturally dependent. In addition to the technical ability to move services between internal and external data centers, security and regulatory impacts to cloud are "hazy" at best.
DCD: You've now had a chance to view the virtualization market from two different angles -- as an analyst at IDC and now as an executive working for a vendor in the space. What about the space do you see now that you didn't catch before in your analyst role?
John Humphreys: The move for me has been really eye-opening and educational from a lot of different perspectives. I think the one that most drew me to the role is the level of complexity that vendors must deal with in making any decision or product changes.
In the analyst realm, the focus is exclusively on strategy. When you jump over the fence, the strategy is still critical, but it is the details in the execution of that strategy that ultimately define the success or failure of any move. That means not only being able to define a successful strategy, but being able to communicate it to the organization, get the sales teams to support the moves, coordinate the infrastructure changes that must occur internally, address supply chain issues, work with partners, etc.
As an analyst, I knew I was only seeing the first piece of the cycle, so I made the move so I could experience "the rest of the story."
Being from Boston, I see a metaphor in the Big Dig and the Chunnel projects. Being an analyst is like planning the Chunnel project, while being part of a technology vendor is like planning and executing the Big Dig. The Big Dig planners had to worry about 300 years of infrastructure and needed to put all sorts of contingency plans in place to ensure successful execution. That "be prepared" requirement for flexibility appeals to me.
DCD: What's the most interesting thing that you see going on in this space right now?
John Humphreys: I see some very interesting business models being developed that leverage the cloud computing concept. I believe the industry is on the cusp of seeing a host of new ideas being introduced. And, perhaps contrary to others, I believe the down economy is a perfect incubator as expectations over the near term are lowered, giving these start-ups the opportunity to more fully develop these new and great ideas and business models. I expect we'll start to see the impact of all this innovation in the next 3-5 years.
...
Data center change: A Big Dig?
Thanks to John for the interview. I know the analogy to the Big Dig was something John meant in the context of how you get an organization building a product to do something big -- and how you have to make sure you're taking into account all the existing structures. However, I'm thinking it's a good one for data centers in general making transitions to a new model. Here's what I mean:
Your goal is to change how your data center runs to drastically cut your costs and improve how stuff gets done. There's a lot of infrastructure technology from Citrix, VMware, Cassatt, and a lot of others that can help you: virtualization, policy-based automation, management tools, the works. But there's all your critical infrastructure that's already in place and (mostly) working that you have to be careful not to disrupt. It's a big job to work around it and still make progress. Kinda like they had to do with the Big Dig.
But, hey, I'm not from Boston, so maybe the analogy breaks down for IT projects. I kind of hope so, actually, since cost overruns and big delays certainly aren't what we're all aiming for. In IT, you certainly have a greater ability to do smaller, bounded projects that show real return -- and still make notable, tangible improvements to running your business. Those of you who lived through the Big Dig probably know better than I do how close a match this is.
On the hybrid public/private cloud capabilities, I think John's on target. The industry conversations about this capability (moving computing from inside your data center, out to the cloud, and then back again) are reaching a fever pitch, but there are a few things that have to get solved before this is going to work. (Here's my recent post on hybrid clouds if you want to dive deeper on the topic.) But it's certainly one of the models that IT ops is going to want to have at its disposal in the future.
The approach we're taking at Cassatt is to help people think about how they might do "cloudbursting" by starting the work on creating a cloud internally first. At the same time, customers often begin experimenting with external cloud services. That early experience on both sides of the internal/external cloud divide will be a big help. (And, we've built our software with an eye toward eventually making cloud federation a reality.)
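To make that concrete, here's a minimal sketch of the decision at the heart of any cloudbursting policy: keep the workload on internal capacity when you have it, and spill over to an external provider only when you don't (and only when policy allows it). Everything here -- the class names, the capacity model, the policy flag -- is invented for illustration; it's not Cassatt's software or anyone's actual API.

# Purely illustrative cloudbursting sketch -- invented names, not a real product API.

class InternalPool:
    def __init__(self, free_cpus):
        self.free_cpus = free_cpus

    def provision(self, cpus):
        self.free_cpus -= cpus
        return "placed internally"

class ExternalCloud:
    def provision(self, cpus):
        return "burst to external cloud"

def place_workload(cpus_needed, allowed_offsite, pool, cloud):
    """Prefer internal capacity; burst out only on overflow, and only if policy allows."""
    if pool.free_cpus >= cpus_needed:
        return pool.provision(cpus_needed)
    if allowed_offsite:
        return cloud.provision(cpus_needed)
    raise RuntimeError("no internal capacity, and this workload may not leave the data center")

# A 16-CPU job overflows an 8-CPU internal pool and bursts out:
print(place_workload(16, True, InternalPool(free_cpus=8), ExternalCloud()))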
There is one thing I might quibble with John on that I didn't during the interview -- his supposition that virtualization is going to be at the core of any "cloudy" infrastructure. My take is that while the concept of separating the software and applications from the underlying hardware infrastructure is a key concept of creating a more dynamic infrastructure, you can still make headway here without having to virtualize everything.
In fact, we've heard a great deal of interest around the internal cloud computing approach, especially when it leverages what someone already has running in their data center -- physical or virtual. Virtualization can be a useful component, but being 100% virtualized is not a requirement. I was pretty critical of VMware's assumptions around this topic in a previous post. The Citrix approach that John walks through above is definitely describing a more realistic, heterogeneous world, but still has some assumptions you'll want to be careful of if you're looking into it.
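As a small illustration of what "physical or virtual" under one roof can look like at the plumbing level, here's a sketch that builds a single inventory from both sides: the virtual half uses the XenAPI Python bindings that ship with the XenServer SDK (the host URL and credentials below are placeholders), while the physical half is just a hard-coded list standing in for whatever asset database you already have. The merging logic is mine, for illustration only.

import XenAPI  # Python bindings shipped with the XenServer SDK

def unified_inventory(xen_url, xen_user, xen_password, physical_hosts):
    """Return one flat list of compute resources, virtual and physical alike."""
    inventory = []

    # Virtual side: ask a XenServer pool for its VMs.
    session = XenAPI.Session(xen_url)
    session.xenapi.login_with_password(xen_user, xen_password)
    try:
        for _ref, rec in session.xenapi.VM.get_all_records().items():
            if rec["is_a_template"] or rec["is_control_domain"]:
                continue  # skip templates and dom0
            inventory.append({"name": rec["name_label"],
                              "kind": "virtual",
                              "state": rec["power_state"]})
    finally:
        session.xenapi.session.logout()

    # Physical side: stand-in for a CMDB or asset-system export.
    for host in physical_hosts:
        inventory.append({"name": host, "kind": "physical", "state": "Running"})

    return inventory

# Placeholder values -- point these at your own environment:
# unified_inventory("https://xenserver.example.com", "root", "secret", ["blade-07", "blade-09"])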
So, if you are starting on (or are in the messy middle of) your own data center Big Dig -- exploring virtualization, the cloud, and all the infrastructure impact that those might have -- feel free to leave a comment here on how it's going.
If you can get past the Jersey barriers, of course.
Wednesday, March 11, 2009
John Humphreys, now at Citrix, sees virtualization competition shifting to management
Posted by Jay Fry at 10:35 PM
I'm sure when John Humphreys left IDC and joined Citrix last year he had to endure lots of barbs about joining "the dark side" from his analyst compatriots. Of course, his new vendor friends were probably saying the exact opposite, yet much the same thing: no more ivory tower, John; it's time to actually apply some of your insights to a real business -- after all, he now had a bunch of real, live customers who wanted help solving data center software infrastructure and IT operations issues.
Regardless of comments from either peanut gallery, John did indeed vacate Speen Street and trade his role running IDC's Enterprise Virtualization Service for a spot at Citrix. He's now focused on the overall strategy and messaging for their Virtualization and Management Division, the group built from the XenSource acquisition and that is actively working on virtualization alternatives to VMware -- and more.
I thought John's dual perspectives on the market (given his current and previous jobs) would make for a worthy Data Center Dialog interview, especially given Citrix's recent news that they will offer XenServer for free. And, there's a dual connection with Cassatt, too: long before they were acquired by Citrix, Cassatt and XenSource had worked together on making automated, dynamic data center infrastructure that much closer to reality (in fact, if you poke around our Resource Center or Partner page, you'll find some old webcasts and other content to that effect). And, we worked with John quite a lot in his IDC days.
In the first part of the interview that I'm posting today, I asked John about the virtualization market, criticism that Citrix has been getting, and what they have in store down the road.
Jay Fry, Data Center Dialog: John, from your "new" vantage point at Citrix, how do you see the virtualization market changing in 2009?
John Humphreys, Citrix: I see the basis for competition in virtualization changing significantly in 2009. In recent years, the innovation around virtualization has been squarely focused on the virtualization infrastructure itself with things like motion, HA, workload balancing, etc. The pace of innovation and the rate of customer absorption of recent innovations have slowed. This was inevitable as the demand curve for new innovations around any technology eventually flattens.
Rather than competing to offer new features or functions on top of the base virtualization platform, I see the companies adding management capabilities that extend across multiple platforms. It's only through this ability to manage holistically that customers will truly be able to change the operational structure of today's complex IT environments.
DCD: How do you think Citrix will play into those changes?
John Humphreys: In a sentence, we are working to drive these changes in the virtualization marketplace.
Specifically, the company recently announced that the full XenServer platform would be freely available for download to anyone. This is a truly enterprise-class product -- not just a gimmicky download trial. It has live migration and full XenCenter management for free, with no limits on the number of servers or VMs a customer can host.
At the same time, we introduced a line of management products (branded Citrix Essentials) that are designed to provide advanced virtualization management across both XenServer and Hyper-V environments. Initially, we have focused on providing tools for integration and automation of mixed virtualization platform environments with capabilities like lab management, dynamic provisioning, and storage integration. Over time you will see more automation capabilities with links back to business policies and infrastructure thresholds.
DCD: What are your customers saying are the most important things that they are worrying about right now? How influenced by the economy are those priorities?
John Humphreys: Cost cutting. Pure and simple. We see the rapid economic decline setting the agenda for at least 2009, and it was a major factor in the decision to offer XenServer for free. In tough economic times, well-documented cost-saving measures like server consolidation are relied upon even more, and being the low-cost provider of an enterprise-class virtualization solution gives Citrix an opportunity to get XenServer into the hands of millions of customers.
DCD: I've seen some press coverage about disappointment regarding the XenSource acquisition by Citrix. The main complaint seems to be that Citrix isn’t being as aggressive as it should be in the space and VMware seems to be adding fuel to the fire by implying that Microsoft will eventually block out Citrix. What are your thoughts on how Citrix is handling XenSource and the competitive environment?
John Humphreys: I've heard those criticisms as well. I think what you have seen already in 2009 is that Citrix has become a lot more aggressive with XenServer. We think we have a distinct position in the marketplace and a unique opportunity. What you saw Citrix announce on February 23rd was the first opportunity to tell the full virtualization story. The free platform download (with motion and management) and the ability to add cross-platform advanced management capabilities is truly unique. We like where we are positioned going forward...and from the early indications, the market likes our position as well.
DCD: Why did you make the jump to the "dark side" of working for a vendor? Is the grass actually greener?
John Humphreys: I always knew I wanted to combine strategy and execution, as to me that is where the magic happens.
...
Up Next: In the second part of this interview, I ask John about his thoughts on (you guessed it) cloud computing, the pros and cons of what Citrix, VMware, and Microsoft have planned in that space, and how much different things look now that he's not an industry analyst with IDC.
Friday, March 6, 2009
Nicholas Carr: IT and economy are cloudy, but the Big Switch is on
Posted by Jay Fry at 9:35 AM
For those at IDC's Directions conference this week in San Jose (my highlights are posted here) who hadn't yet read his book, The Big Switch, Nicholas Carr used his keynote to walk through his "IT-is-going-to-be-like-the-electrical-utility" metaphor, and his reasoning behind it. Many, I'm sure, had heard it before (some even complained about it a bit on Twitter). But that doesn't make it any less likely to come true.
IT, he argued, is the next resource to go through a very similar transformation to what happened with electricity, moving from private generation of what was needed to a public grid. Grove's law about bandwidth's growth has "been repealed," said Carr. It is "the beginning of the next great sea change in information technology" -- the move to utility or cloud computing. (For more detail on this, Andy Patrizio of Internetnews.com also posted a good run-down.) So far, so good, but here's a question: has anything shifted since Carr's book came out? How far have we come?
Big changes in attitudes on cloud/utility computing since last year
In the 14 months since The Big Switch was published, people's attitudes have changed dramatically. What was greeted (even in Carr's estimation during his keynote) with skepticism early last year is pretty much a done deal this year. We're moving to "simply a better model for computing without hardly even noticing it," he said. "Cloud computing has become the center of investment and innovation."
Now, you can argue about the speed and impact of the changes, but it's hard to argue that the change isn't happening. Gordon Haff of Illuminata published a good thought-piece on whether or not what's happening is really as profound as Carr suggests, and whether data centers are even seeing those changes yet. Healthy skepticism, for sure, especially when talking about the data centers of big organizations.
However, Carr used his time onstage this week at the IDC conference to extend the vision his book laid out and talk a bit about the intermediary steps we're going through in the "big switch." In his view, cloud computing can be many things for organizations. It can be a new do-it-yourself model for IT. A supplement to whatever IT exists that's "shovel-ready." A replacement. A democratizer. Even a complete revolution in which IT and business finally get on the same page, shocking as that may be.
No matter which view (or, more likely, combination of views) ends up being true, Carr was clear to IT folks and businesses alike, if a little understated: "It behooves you to look into this. Figuring out how to harness the power" of a public cloud "may be the great enterprise of this century."
I think on this point Carr, IDC and their analyst brethren, and vendors like my company, Cassatt, and others are all pretty unanimous in what they're saying. Despite some false starts at the beginning of the new century, the big change for IT -- being able to actually use utility-style computing -- is really here. It's called cloud computing. Of course, it's still early days, but let the fun begin. Actually, if you hadn't noticed, it already has.
Now, comes the messy part: making this new model work
Carr was pretty clear that this is going to take a few steps. No arguments here. And while he did joke about the term "private clouds" being an oxymoron if there ever was one (my Appirio Twitter friends loved that one; see also the many previous posts here and elsewhere arguing that topic pro and con), he also talked about private clouds as one of the many steps along this path. In looking at applying the cloud model to the data center infrastructures organizations already have, we are asking "what can we learn about how the cloud operators do IT -- and revamp our own data centers on the cloud model." To Carr, internal cloud computing sounded like a "big opportunity while a public cloud is being built out."
This is also where things get messy for today's big IT suppliers, thanks to a little thing called the innovator's dilemma (from Clayton Christensen). "Big traditional vendors are in quite a fix today," said Carr. "You have big vendors making investments in cloud computing, but it's almost on faith because they haven't figured out how to make money on it yet. The problem is not in the technology, it's in the business model."
Add to this a tidbit from a survey IDC's Frank Gens talked up earlier in the day: the least important buying criteria for cloud services were that the company be a large, established company, or that the company had done business with an organization before. Well, now...that certainly opens up the playing field quite a bit.
Carr pointed to a number of hurdles that still exist, like having to hit a level of reliability beyond what's been required today just to prove all this cloud stuff is safe. However, based on what I hear from all sides (especially customers), I think the engine of change has really started up on cloud computing. Everyone is talking about it, testing the hypotheses from all angles. But that's the easy part. More importantly, though, people are trying it. Some are moving fast; some are moving cautiously. Some are trying external clouds; some are applying the cloud concept to the resources inside their own data centers. And, the "cloudy" (er, grim) economy, as Carr called it (and as Frank Gens noted earlier in the day), is probably helping in its own way, too.
As unlikely as it may have seemed when his book came out last year, Carr's "big switch" is on.
Thursday, March 5, 2009
IDC: Downward Directions for IT in 2009 leave room for cloud computing uptick
Posted by Jay Fry at 3:55 PM
IDC's 44th annual Directions conference in San Jose this week may be the longest running IT conference in the world, but it didn't pull any punches on the economy. From John Gantz's opening keynote through every track session I attended, the analysts recounted what anyone running a data center knows all too well: IT spending is pulling way back. IDC wisely did a mid-year course-correction on their 2009 spending prediction at the end of last year, and they used some of these revisions at the conference to show how far -- and how fast -- things have headed down. As I was sitting in the audience, I started to wonder if even those revisions were deep enough. Only cloud computing escaped the dour forecast (more on that in a minute).
Here's a quick summary of the key points I took away from the conference, focusing on IDC's take on the macro-level IT environment, the impact of the economy on running a data center, and -- the lone bright spot -- how cloud computing figures into all this. On that last point, let's just say Frank Gens, the day's cloud presenter, was positively giddy to be the one guy who got to deliver good news. The highlights:
The economy has us in a dark, dark place -- but IT is needed now more than ever
John Gantz, IDC's chief research officer, summed up the effect of the economy at the start of the day: "I don't think we've been here before. We're in new territory. We're in the dark" because we don't have a very good handle on what the economy's going to do next. Gantz noted that IDC has ratcheted down IT spending predictions for this year to nearly flat over 2008 (up only 0.5%). That doesn't take into account any effect from the Obama stimulus package (or those from other governments elsewhere in the world). IDC told their analysts not to try to quantify stimulus package impact, said Gantz, but to assume they "won't make things worse." Let's hope. One positive note: 2010's growth rate looks positively robust, but of course that's because it's building on the catastrophe that 2009 is working out to be.
However, says Gantz, the bad economy is not slowing down the increase in mobile Internet users or the adoption of non-traditional computing devices, nor is it putting the brakes on the amount of data being gathered or the number of user interactions per day (predicted to increase to 8.4 times its current rate in the next four years). And that's all something for IT to deal with.
So, said Gantz, amid this "extinction event," there are incredible new demands for management. "The economic crisis changes everything and it changes nothing. We have a new, new normal." The current situation merely forces the issue on seeing and doing things differently. "If everything is crashing down around you," said Gantz, "now is a good time to take a risk. Now is a period of opportunity." He noted companies like Hyatt, GE, RIM, FedEx, HP, and IBM had all been started in recessions (I've also written about great innovations during previous downturns).
What opportunities did he see in particular right now? Gantz noted enterprise social media, IT outsourcing, virtualization software, and Internet advertising (really). Of particular note: virtualization management software. Which has a big impact on IDC's view of what's happening in the data center...
The move to more modular, pay-as-you-go data centers -- with warnings about virtualization management
Michelle Bailey, presenting her content from IDC's Data Center Trends program, seemed very concerned about how hard and complex managing a data center had become, and believed that we're going to see customers making moves to simplify things out of necessity.
The recession, said Bailey, "changes the decision on where to hold the [data center] assets." Its main impact is to push data center managers to move "from a fixed price model to a variable pricing model," to move costs from cap ex to op ex.
Virtualization has had a huge impact so far, and will continue to do so, according to Matt Eastwood, IDC group vice president for enterprise platforms. In fact, there will be more VMs than physical servers deployed in 2009. "It will be the cross-over year," said Eastwood.
However, that drives big, big concerns on how data center managers are going to cope, said Bailey. "The thing I worry about the most with virtualization is the management consequences. There's no way to manage this with the processes and tools in place today." In fact, Bailey is so worried that she thinks this "virtualization management gap" might stall the virtualization market itself as users search for management solutions. "I’m worried that customers may have gone too far and may have to dial it back," she said. "The challenge in the server virtualization world is that people aren't used to spending a lot of money on systems management tools."
When we at Cassatt talk to customers about this, we've found that they know there is a virtualization management problem and are actively trying to address it. The approach we talk to these customers about is having a coherent strategy for managing all of your data center components based upon the application service levels you need, regardless of whether the compute resources are physical or virtual. Having a separate management stack for each virtualization vendor and another one for their physical systems is not appealing, to say the least.
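For the flavor of what "manage to application service levels, regardless of physical or virtual" means in practice, here's a toy sketch of such a control loop. It is not Cassatt's actual software; the names, thresholds, and data shapes are invented for this example.

# Toy service-level-driven capacity loop -- illustration only, not a real product.

def rebalance(services, spare_pool):
    """Give breaching services capacity from a shared pool of machines,
    without caring whether those machines are physical or virtual."""
    for svc in services:
        if svc["response_ms"] > svc["sla_ms"] and spare_pool:
            svc["nodes"].append(spare_pool.pop())         # grab any idle node
        elif svc["response_ms"] < 0.5 * svc["sla_ms"] and len(svc["nodes"]) > 1:
            spare_pool.append(svc["nodes"].pop())          # release excess capacity
    return services, spare_pool

# One overloaded web tier, one comfortable batch tier, and a mixed spare pool:
services = [
    {"name": "web",   "sla_ms": 200,  "response_ms": 450, "nodes": ["vm-03"]},
    {"name": "batch", "sla_ms": 5000, "response_ms": 800, "nodes": ["blade-07", "vm-11"]},
]
spare = ["blade-09", "vm-17"]
print(rebalance(services, spare))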
One of Bailey's other most important points was that there isn't just one type of data center -- there are actually three:
1. Enterprise-style data centers focus on SLAs, cost containment, and are dealing with space issues.
2. Hosting/outsourcer data centers focus on doing what's necessary to meet customer demand.
3. Web 2.0/telco-style data centers are all about cost efficiency and growth.
Trying to compare how you run your data center with one that has a different set of goals is not productive and will get you focused on the wrong things -- and result in more of a mess.
She did say, however, no matter what type of data center you are running, to look at doing things in a much more modular way, as a way to simplify. Bailey called "massively modular" the blueprint for the future data center. This helps break down big problems into smaller, more manageable ones, and ensures that you don't have to be absolutely correct in your 20-year vision for your data center. She sees things like containerized data centers becoming more standardized and less proprietary, making this modular approach more complementary than disruptive to what data centers are already doing. And, with power and cooling still a huge problem for data centers, IT ops and facilities need help from both a more modular approach and the "pretty sophisticated" power management tools that exist. (I like to think that she was thinking of us at this point in her presentation.)
Cloud computing is on track to move to the mainstream -- and show actual growth despite the economy
Bailey had a healthy dose of cloud computing skepticism in her break-out presentation: "Anything that has money attached to it can't be [in the cloud] for another 10 years," she said, clearly paving the way for big organizations with security, compliance, and lock-in concerns to give this cloud model a try, but to do so within their own data centers as an internal cloud.
In his keynote on cloud computing, Frank Gens acknowledged a lot of the concerns that companies have been expressing about going to an external cloud; he was, however, very upbeat. "The idea of cloud is of very, very high interest to CIOs in the market right now," he said. Last year IDC predicted that 2009 would be "the year of moving from the sandbox to the mainstream," said Gens. "We are certainly on that path right now."
Why? Maybe not for the reasons you might think (cost). Gens corroborated comments from Gartner's Tom Bittman at their Data Center Conference back in December: the No. 1 reason that people want to move to the cloud is that "it's fast" to do so.
This new cloud model hasn't yet bulldozed the old model for IT, according to IDC, for reasons we've heard (and Michelle Bailey mentioned above): deficiencies in security, performance, availability, plus problems integrating with in-house IT. Gens sees cloud computing beginning the move across Geoffrey Moore's chasm toward mainstream adoption as a result of a couple things: performance-level assurances and being able to connect back to on-premise systems.
"Service level assurances are going to be critical for us to move this market [for cloud computing] to the mainstream," said Gens. And, customers want the ability to do hybrid public/private cloud computing: "They want a bridge and they want it to be a two-way bridge" between their public and private clouds.
And, despite all the economic negativity, IDC painted a pretty rosy picture for cloud computing, noting that it's where the new IT spending growth would be happening. Gens described it as the beginning of the move to a more dynamic deployment of IT infrastructure, and part of an expanding portfolio of options for the CIO.
"We’re right where we were when the PC came along or when the Internet first came out," said Gens. As far as directions go, that's pretty much "up."
Up next: comments on Nicholas Carr's closing keynote at IDC Directions San Jose. Slides from the IDC presentations noted above are available for IDC customers in PDF format in their event archives at www.IDC.com.
Here's a quick summary of the key points I took away from the conference, focusing on IDC's take on the macro-level IT environment, the impact of the economy on running a data center, and -- the lone bright spot -- how cloud computing figures into all this. On that last point, let's just say Frank Gens, the day's cloud presenter, was positively giddy to be the one guy who got to deliver good news. The highlights:
The economy has us in a dark, dark place -- but IT is needed now more than ever
John Gantz, IDC's chief research officer, summed up the effect of the economy at the start of the day: "I don't think we've been here before. We're in new territory. We're in the dark" because we don't have a very good handle on what the economy's going to do next. Gantz noted that IDC has ratcheted down IT spending predictions for this year to nearly flat over 2008 (up only 0.5%). That doesn't take into account any effect from the Obama stimulus package (or those from other governments elsewhere in the world). IDC told their analysts not to try to quantify stimulus package impact, said Gantz, but to assume they "won't make things worse." Let's hope. One positive note: 2010's growth rate looks positively robust, but of course that's because it's building on the catastrophe that 2009 is working out to be.
However, says Gantz, the bad economy is not slowing down the increase in mobile Internet users, the adoption of non-traditional computing devices, nor is it putting the brakes on the amount of data being gathered or user interactions per day (predicted to increase to 8.4 times its current rate in the next 4 years). And that's all something for IT to deal with.
So, said Gantz, amid this "extinction event," there are incredible new demands for management. "The economic crisis changes everything and it changes nothing. We have a new, new normal." The current situation merely forces the issue on seeing and doing things differently. "If everything is crashing down around you," said Gantz, "now is a good time to take a risk. Now is a period of opportunity." He noted companies like Hyatt, GE, RIM, FedEx, HP, and IBM had all been started in recessions (I've also written about great innovations during previous downturns).
What opportunities did he see in particular right now? Gantz noted enterprise social media, IT outsourcing, virtualization software, and Internet advertising (really). Of particular note: virtualization management software. Which has a big impact on IDC's view of what's happening in the data center...
The move to more modular, pay-as-you-go data centers -- with warnings about virtualization management
Michelle Bailey, presenting her content from IDC's Data Center Trends program, seemed very concerned about how hard and complex managing a data center had become, and believed that we're going to see customers making moves to simplify things out of necessity.
The recession, said Bailey, "changes the decision on where to hold the [data center] assets." Its main impact is to push data center managers to move "from a fixed price model to a variable pricing model," to move costs from cap ex to op ex.
Virtualization has had a huge impact so far, and will continue to do so, according to Matt Eastwood, IDC group vice president for enterprise platforms. In fact, there will be more VMs than physical servers deployed in 2009. "It will be the cross-over year," said Eastwood.
However, that drives big, big concerns on how data center managers are going to cope, said Bailey. "The thing I worry about the most with virtualization is the management consequences. There's no way to manage this with the processes and tools in place today." In fact, Bailey is so worried that she thinks this "virtualization management gap" might stall the virtualization market itself as users search for management solutions. "I’m worried that customers may have gone too far and may have to dial it back," she said. "The challenge in the server virtualization world is that people aren't used to spending a lot of money on systems management tools."
When we at Cassatt talk to customers about this, we've found that they know there is a virtualization management problem and are actively trying to address it. The approach we talk to these customers about is having a coherent strategy for managing all of your data center components based upon the application service levels you need, regardless of whether the compute resources are physical or virtual. Having a separate management stack for each virtualization vendor and another one for your physical systems is not appealing, to say the least.
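To make that a bit more concrete, here's a rough sketch in Python of what a single, SLA-driven control loop might look like when it treats physical and virtual capacity as one pool. To be clear, the names, thresholds, and logic below are invented for illustration; this is not our actual product code.

```python
# Illustrative sketch only: one control loop reacting to application service
# levels, treating physical and virtual capacity as a single pool.
from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    kind: str          # "physical" or "virtual" -- managed the same way
    allocated: bool = False

def enforce_sla(observed_response_ms: float, target_response_ms: float,
                pool: list[Resource]) -> list[str]:
    """Allocate the next free resource (of either kind) when the SLA slips."""
    actions = []
    if observed_response_ms > target_response_ms:
        for r in pool:
            if not r.allocated:
                r.allocated = True
                actions.append(f"provision {r.name} ({r.kind})")
                break
    return actions

pool = [Resource("blade-07", "physical"), Resource("vm-112", "virtual")]
print(enforce_sla(observed_response_ms=950, target_response_ms=500, pool=pool))
```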
One of Bailey's other most important points was that there isn't just one type of data center -- there are actually three:
1. Enterprise-style data centers focus on SLAs, cost containment, and are dealing with space issues.
2. Hosting/outsourcer data centers focus on doing what's necessary to meet customer demand.
3. Web 2.0/telco-style data centers are all about cost efficiency and growth.
Trying to compare how you run your data center with one that has a different set of goals is not productive and will get you focused on the wrong things -- and result in more of a mess.
She did say, however, that no matter what type of data center you are running, you should look at doing things in a much more modular way as a path to simplification. Bailey called "massively modular" the blueprint for the future data center. This approach breaks big problems down into smaller, more manageable ones, and means you don't have to be absolutely correct in your 20-year vision for your data center. She sees things like containerized data centers becoming more standardized and less proprietary, making this modular approach more complementary than disruptive to what data centers are already doing. And, with power and cooling still a huge problem for data centers, IT ops and facilities need help from both a more modular approach and the "pretty sophisticated" power management tools that exist. (I like to think that she was thinking of us at this point in her presentation.)
Cloud computing is on track to move to the mainstream -- and show actual growth despite the economy
Bailey had a healthy dose of cloud computing skepticism in her break-out presentation: "Anything that has money attached to it can't be [in the cloud] for another 10 years," she said, clearly paving the way for big organizations with security, compliance, and lock-in concerns to give this cloud model a try, but to do so within their own data centers as an internal cloud.
In his keynote on cloud computing, Frank Gens acknowledged many of the concerns companies have been expressing about going to an external cloud, but he was very upbeat nonetheless. "The idea of cloud is of very, very high interest to CIOs in the market right now," he said. Last year IDC predicted that 2009 would be "the year of moving from the sandbox to the mainstream," said Gens. "We are certainly on that path right now."
Why? Maybe not for the reasons you might think (cost). Gens corroborated comments from Gartner's Tom Bittman at their Data Center Conference back in December: the No. 1 reason that people want to move to the cloud is that "it's fast" to do so.
This new cloud model hasn't yet bulldozed the old model for IT, according to IDC, for reasons we've heard before (and that Michelle Bailey mentioned above): deficiencies in security, performance, and availability, plus problems integrating with in-house IT. Gens sees cloud computing beginning the move across Geoffrey Moore's chasm toward mainstream adoption as a result of a couple of things: performance-level assurances and the ability to connect back to on-premises systems.
"Service level assurances are going to be critical for us to move this market [for cloud computing] to the mainstream," said Gens. And, customers want the ability to do hybrid public/private cloud computing: "They want a bridge and they want it to be a two-way bridge" between their public and private clouds.
And, despite all the economic negativity, IDC painted a pretty rosy picture for cloud computing, noting that it's where the new IT spending growth will happen. Gens described it as the beginning of a move to a more dynamic deployment of IT infrastructure, and part of an expanding portfolio of options for the CIO.
"We’re right where we were when the PC came along or when the Internet first came out," said Gens. As far as directions go, that's pretty much "up."
Up next: comments on Nicholas Carr's closing keynote at IDC Directions San Jose. Slides from the IDC presentations noted above are available for IDC customers in PDF format in their event archives at www.IDC.com.
Wednesday, January 7, 2009
IDC's Al Gillen: Is a bad economy good for cloud computing?
Posted by Jay Fry at 3:22 PM
Today's post is Part 2 of our recent interview with Al Gillen, program vice president of system software at IDC.
In the first part of our interview with Al that I posted yesterday, Al noted that virtualization is still on the rise, but more because of use cases like HA and DR than the traditional drive to simply consolidate servers. He saw systems management taking a much more important role, and virtualization serving as a "proof of concept" for cloud computing. The move to cloud computing is a transition that he says we will be continuing to talk about and work through "for the next 15 years." To me that’s a good indication of the impactful, fundamental change possible with the cloud. And the work that's still ahead of us.
In today's post, I asked him a bit more about cloud computing and one of the questions that's on everyone's minds: despite all the excitement on the topic (and, yes, the cloud computing PR train continues unabated in the new year, if my inbox is any indication), how will a dire economy affect all of this? His answer: don't expect people to suddenly make radical changes in dangerous times. They'll do what makes sense, after seeing some relevant proof points.
Here's the final excerpt from the interview:
Jay Fry, Data Center Dialog: Here's the "$700-billion question": I've heard two different schools of thought about how the current economic, um, tailspin will affect cloud computing. Do you see it speeding things along because of the chance to spend less (and get more flexibility), or slowing things down as IT becomes more cautious with any and all spending?
Al Gillen, IDC: This is a tough call. Cloud sounds good, but in reality, unless cloud computing really offers customers a seamless way to expand their resources, and does so at a lower cost, it will not be a short-term solution. Even if it does achieve these benefits in the first instantiation, cloud computing still has to overcome conservative concerns about things like data security, integrity, privacy, availability, and more. Corporate IT managers didn't get to their current job positions by taking excessive risks with company assets.
DCD: How long do you think the move toward cloud computing will take? What are the most important drivers and what's necessary for people to get comfortable with it?
Al Gillen: As I noted [in Part 1 of this interview], I believe that we will be talking about the transition to cloud computing for the next 15 years. Let's consider x86 virtualization, a market segment that VMware commercialized. VMware has been around for a decade now, and has had products in the market for about 8 years. Despite VMware's phenomenal success and billion-plus [dollar] revenue run rate [per year], the x86 server market remains only lightly penetrated by virtualization software. How much longer will it take for x86 server virtualization to become pervasive? Certainly yet another 5 years, at a minimum. It is unlikely that "cloud" can turn the industry upside down any faster.
DCD: You've heard talk from us and I'm sure others as well about internal cloud computing as a way to be able to help folks get the benefits of cloud computing without a lot of the current negatives -- and to do so using the heterogeneous IT resources they already have. What role do you see for internal cloud computing?
Al Gillen: Internal cloud computing represents a wonderful opportunity to get comfortable with the concept, and frankly, has the distinct potential to allow customers to lower their overall IT expense and raise their "greenness" using resources they have in place today.
…
On that note, I'd like to thank Al for being our first interview guinea pig. Despite the huge amount of interest around cloud computing and many vendor wishes to the contrary, what we hear from customers matches a lot of what Al says. The move toward running your data center differently comes in incremental steps, and those steps can't be rushed.
However, this sour economic climate is a great reason to champion IT projects and approaches that can save money and deliver results immediately. Who doesn't want to find a better, cheaper way to do things right now? (That's the reason we try to start out customer conversations with the payback/results requirements -- what is it you need to accomplish with your infrastructure management? Many people start with our IT ops and server power savings calculators to give them an initial picture.)
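For the curious, here's a back-of-the-envelope version of that kind of estimate. Every number below is an illustrative assumption -- plug in your own -- and it's not the output of our actual calculators:

```python
# Rough server power savings estimate; all inputs are made-up assumptions.
servers            = 500     # servers eligible for power management
watts_per_server   = 400     # average draw per server, in watts
idle_hours_per_day = 12      # hours/day a server could be powered off
power_cost_kwh     = 0.10    # $/kWh
pue                = 2.0     # facility overhead multiplier (cooling, etc.)

kwh_saved_per_year = (servers * watts_per_server / 1000
                      * idle_hours_per_day * 365 * pue)
annual_savings = kwh_saved_per_year * power_cost_kwh
print(f"~{kwh_saved_per_year:,.0f} kWh/year avoided, ~${annual_savings:,.0f}/year saved")
```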
If you're interested in any of the research supporting Al's interview comments, you can get it (provided you're an IDC client) at their website. Frank Gens also produces IDC eXchange, an IDC blog with other, unrestricted commentary (including a lot about cloud computing and their predictions for IT in 2009) that Ken Oestreich pointed out to me. Al also noted that they have a Jan. 20 webcast to review those 2009 IT predictions that's open to the public. "We will talk about virtualization and cloud there, among other topics," he said.
Tuesday, January 6, 2009
Interview with IDC's Al Gillen: in 2009, virtualization will help bridge to the cloud
Posted by Jay Fry at 2:19 PM
When I first started this blog a few months back, I told you we'd do our best to bring you a cogent collection of useful ideas, opinions, and other relevant tidbits from those steeped in the world of running data centers. So far, that's generally taken the form of blogging from industry events (like my posts from the Gartner Data Center Conference in Vegas in early December) or commentary on IT ops topics sparked by an article or event. Our chief engineer, Craig Vosburgh, even got in on the act last month, launching the first in a series of posts scrutinizing the necessary components of cloud computing, relying on his experience in this space (and his highly tuned B.S. detector, I might add) to help cut through the hype.
And now for something completely different.
In that first blog post, I promised we'd feature give-and-take interviews with people you should know, in and around the world of IT. We're starting the new year off with a bang, in fact: today's post is the first of these interviews -- an interview with Al Gillen, program vice president of system software at IDC. When we talked to Al about Cassatt's "internal cloud" capabilities announcement in November, he had some interesting observations about virtualization, cloud computing, and where the industry is headed.
I recently had a chance to ask him about some of these comments in a little more detail, and I'm posting his responses today and tomorrow. By the way, Al was just recently promoted to run the Enterprise Virtualization Software research team at IDC following the departure of John Humphreys to Citrix. Nicely done, Al.
In today's excerpt, Al notes some shifts in the reasons people are virtualizing servers. In 2009, he says, it won't all be about consolidation. But, as the shift toward virtualization continues, the need for a way to manage virtual systems (in parallel with physical servers, I might add) becomes much more urgent. He believes this move toward virtualization is actually a bridge to cloud computing.
Here’s an excerpt of that interview:
Jay Fry, Cassatt Data Center Dialog: Congrats on the expanded role at IDC, Al. When we talked last, you mentioned the industry was starting to see the "secondary impact of virtualization" -- one aspect of which is a necessary intersection between systems management and virtualization. Can you explain a little about what you are seeing out there?
Al Gillen, IDC: The first wave of virtualization adoption was heavily weighted to consolidation and server footprint reduction. Those use cases were a natural fit for virtualization, and to be honest, there is still lots of consolidation to take place in the industry. But users find out very quickly that server consolidation does nothing to simplify the management of the operating systems and layered software, and in some cases, can make it more complex, especially if there is any balancing going on aboard the virtualized servers. Therefore, organizations that did not have a strong management solution in place before beginning to adopt virtualization will be finding that is their next most important acquisition.
DCD: How do you see the virtualization market changing in 2009?
Al Gillen: Our survey data is finding that consolidation is no longer the stand-out favorite use case for virtualization. To be sure, consolidation is not going away, but other use cases are growing in popularity as a primary objective for deploying a virtualized solution. Use cases such as high availability and disaster recovery are emerging as strong use cases today. The other factor in play here is that a lot of the largest consolidation opportunities -- not all of them, but many of them -- already have either been consolidated or are committed to a consolidation strategy and to a given vendor. As a result, the "green field" for large scale consolidations is becoming less green and less big. However, the opportunity for larger numbers of smaller scale consolidations is still enormous.
DCD: How will the economy affect how organizations are adopting virtualization?
Al Gillen: The economic conditions are likely to drive customers more quickly toward virtualization simply because the IT departments, which often are seen by management as a cost center, will be charged with reducing expenses, while still delivering the services the business needs to be successful. Being charged with "doing more with less" has been the mandate that has helped many of the transitional technologies that we now consider mainstream to get a first foothold. Can you say Linux? x86? Or Windows?
DCD: You probably guessed I was going to ask a cloud computing question. How do you think what's going on with virtualization intersects with or is different from cloud computing?
Al Gillen: There is no question that virtualization will serve as the bridge to bring customers from today's physical data center to the compute resource that will serve as tomorrow's data center. Virtualization serves several roles in this transition, including a "proof of concept" that you really can pool multiple smaller machines into a larger resource and begin to allocate and consume the resulting superset of resources in an intelligent, scalable, and reliable manner. But virtualization also will be the lubricant that makes it possible to slide software stacks off the hardware foundations to which the software used to be permanently attached. That said, cloud will be adopted very much on a transitional basis, and will penetrate different verticals and different platforms at different rates. I believe we will be talking about the transition to cloud computing for the next 15 years.
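To picture the "pooling" Al describes, here's a toy sketch (my illustration, not his, with invented names) of several small machines being folded into one larger resource that workloads then draw from:

```python
# Toy illustration: small machines presented as one larger, shared resource.
class ResourcePool:
    def __init__(self):
        self.total_cores = 0
        self.free_cores = 0

    def add_host(self, cores: int) -> None:
        """Fold another physical machine into the shared pool."""
        self.total_cores += cores
        self.free_cores += cores

    def allocate(self, cores: int) -> bool:
        """Carve a slice of the pooled superset out for a workload, if available."""
        if cores <= self.free_cores:
            self.free_cores -= cores
            return True
        return False

pool = ResourcePool()
for host_cores in (4, 4, 8):       # three small servers...
    pool.add_host(host_cores)
print(pool.allocate(12))           # ...together host a 12-core workload
```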
Next time: more from Al on cloud computing and the impact of the economic meltdown.
Thursday, December 11, 2008
Marking green IT's progress: 'it's been slow going'
Posted by Jay Fry at 10:22 PM
Two things got me thinking about the state of green IT today. First, with word "leaking" out about President-elect Obama's choice for energy secretary, I started wondering what sort of, um, change might be ahead in how the government will try to shape the energy footprint of data centers. From the energy-efficient data center work we've been involved with over the past 18 months, I'm hoping any actions build on the work that Andrew Fanara, the EnergyStar folks, the EPA, the DoE, and the like have been pushing.
Second, I saw mention of an IDC report yesterday at GreenComputing predicting it will be a good year for green IT, despite the global economic worries. Energy efficiency work in IT will sneak in the data center's back door as "cost cutting," according to Frank Gens of IDC, since companies will adopt new technologies only if they have a speedy payback.
Both of those things led me to this question: what kind of progress (or lack thereof?) in data center energy efficiency have we seen in the past 12-18 months?
The Gartner Data Center Conference last week provided some good fodder for a report card of sorts. In fact, they had analyst Paul McGuckin, PG&E energy-efficiency guru Mark Bramfitt, and VMware data center expert Mark Thiele onstage for a panel that pretty much summed it all up in my mind. Some highlights:
"It's been slow going." Bramfitt kicked off his comments acknowledging that the incentive programs he's been running in Northern California for PG&E (the benchmark for many other programs nationwide) have only crept along, even though he has 25 things he can pay end users for. Most, said Bramfitt, are related to data center air conditioning, and not related to IT. Yet.
Despite the slow start, Bramfitt has asked his bosses for $50 million to hand out over the next three years, and was legitimately proud of a $1.4 million check he was presenting to a customer this week (he didn't name them during the panel, but I'm betting it was NetApp based on this story). VMware's Thiele noted that PG&E can actually be pretty flexible in dreaming up ways to encourage data center energy savings. He mentioned creating two completely new incentive programs in conjunction with PG&E, both in his current job and with previous employers.
People are realizing that energy efficiency is more about data center capacity than cost. Bramfitt noted that the real value of the incentive check that utilities can provide back to end users is not the money. It's the ability to "wave that check around saying, 'Look, PG&E paid me to do the right thing.'" More interestingly, though, "the people in my programs understand that energy efficiency has value for capacity planning and growth," Bramfitt said. "It's a capacity issue. It's not financial."
IT and facilities are only just starting to work together. Gartner's McGuckin said these two groups are still pretty separate. That matches data we published from our energy-efficiency survey from earlier this year. "I'm convinced," said McGuckin, "that we're not going to solve the energy problem in the data center without these two groups coming together."
Thiele talked about an idea that I've heard of being tried by a few companies: a "bridging-the-gap" person -- a data center energy efficiency manager -- that sits between IT and facilities. Thiele has someone doing exactly this working with him on VMware's R&D data centers. This is someone who looks at the data center as a system, an approach that, based on what our customers tell us, really makes a lot of sense. "So far it has been really, really well received," he said.
Mythbusting takes a long time. We started a page on our website of server power management myths back in late summer 2007. One of the first objections we encountered then was that it's bad to turn off servers. We still hit this objection over a year later. And I expect we will continue to for a while yet (old habits die hard!). At one point in the Gartner panel, McGuckin playfully asked Thiele about a power management comment he made: "You're stepping on another myth -- you're not suggesting shutting down servers, are you?"
"I am," Thiele said.
To help Thiele make his point, McGuckin then quoted Intel saying that servers can handle 3,000 power cycles, which translates to an 8-10 year life cycle for most machines if you turn them off once a day (Thiele noted that most get replaced in 3-5 years anyway). We're glad to help these guys do a bit of mythbusting, but I can tell you, it’s not a short-term project. "Things I considered to be gospel two years ago are being heard by people as new today," said Thiele. Having Gartner and other thought leaders continue to add their voice to this discussion can't hurt.
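If you want to check that arithmetic yourself, it only takes a couple of lines:

```python
# Quick check of the figure McGuckin cited: 3,000 rated power cycles,
# consumed at one power-off per day.
rated_cycles = 3000
cycles_per_day = 1
years_of_daily_cycling = rated_cycles / (cycles_per_day * 365)
print(f"{years_of_daily_cycling:.1f} years")  # ~8.2 years -- well past a typical 3-5 year refresh
```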
The best thing to do? Nibble at the problem. In the end, suggested Thiele, you need to work on energy efficiency in stages. Maybe that's a metaphor for the progress green IT is making overall. "Nibble at this," he said. "Don’t try to eat the whole whale at once. Pick a couple things you can get some traction on. Get some quick wins."
I know we've certainly adjusted our approach to help customers start small and even set their projects up as self-funding whenever possible. Quick, high-return "nibbles" are probably the best single way to make the case for energy-efficient IT -- and to ensure that next year shows a lot broader success throughout the industry than we've seen in the year that's gone by.