Often, IT considers cloud computing for things it was doing anyway, with the hope of doing them much cheaper. Or, more likely from what I’ve heard, much faster. But last week a customer reminded me of one of the more important implications of cloud: you can do things you really wouldn’t have done otherwise.
And what a big, positive benefit for your IT operations that can be.
A customer’s real, live cloud computing experience at Gartner Symposium
The occasion was last week’s Gartner Symposium and ITxpo conference in Orlando. I sat in on the CA Technologies session that featured Adam Famularo, the general manager of our cloud business (and my boss), and David Guthrie, CTO of Premiere Global Services, Inc. (PGi). It probably isn’t a surprise that Guthrie’s real-world experiences with cloud computing provided some really interesting tidbits of advice.
Guthrie talked about how PGi, a $600 million real-time collaboration business that provides audio & web conferencing, has embraced cloud computing (you’ve used their service if you’ve ever heard “Welcome to Ready Conference” and other similar conferencing and collaboration system greetings).
What kicked off cloud computing for PGi? Frustration. Guthrie told the PGi story like this: a developer was frustrated with the way PGi previously went about deploying business services. From his point of view, it was way too time-consuming: each new service required a procurement process, the set-up of new servers at a co-location facility, and the like. That developer tracked down 3Tera AppLogic (end of sales pitch) and PGi began to put it to use as a deployment platform for cloud services.
What does PGi use cloud computing for? Well, everything that’s customer facing. “All the services that we deliver for our customers are in the cloud,” said Guthrie. That means they use a cloud infrastructure for their audio and web collaboration meeting services. They use it for their pgi.com web site, sales gateways, and customer-specific portals as well.
Guthrie stressed the benefits of speed that come from cloud computing. “It’s all about getting technology to our customers faster,” said Guthrie. “Ramp time is one of the biggest benefits. It’s about delivering our services in a more effective way – not just from the cost perspective, but also from the time perspective.”
Cloud computing changes application development and deployment. Cloud, Guthrie noted in his presentation, “really changed our entire app dev cycle. We now have developers closer to how things are being deployed.” He said it helped fix consistency problems with applications across dev, QA, and production.
During his talk, Guthrie pointed to some more fundamental differences they've been seeing, too. He described how the cloud-based approach has been helping his team be more effective. “I’ve definitely saved money by getting things to market quicker,” Guthrie said.
But, more importantly, said Guthrie, “it makes you make better applications.” “I was also able to add redundancy where I previously wouldn’t have considered it” thanks to both cost efficiencies and the ease with which cloud infrastructure (at least, in his CA 3Tera AppLogic-based environment) can be built up and torn down.
“You can test scenarios that in the past were really impractical,” said Guthrie. He cited doing side-by-side testing of ideas and configurations that never would have been possible before to see what makes the most sense, rolling out the one that worked best, and discarding those that didn't. Except in this case, "discarding" doesn't mean tossing out dedicated silos of hardware (and hard-wired) infrastructure that you had to manually set up just for this purpose.
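To make that “test, compare, discard” idea concrete, here’s a minimal sketch of the workflow. It’s purely illustrative: the provision(), run_load_test(), and teardown() helpers are stand-ins I’ve invented for whatever your cloud platform actually exposes, not the AppLogic API.

```python
# Purely illustrative sketch of the side-by-side testing Guthrie describes:
# spin up disposable environments, compare results, keep the winner.
# provision(), run_load_test(), and teardown() are invented stand-ins for
# whatever your cloud platform exposes -- this is not the AppLogic API.
import random

def provision(config: dict) -> dict:
    """Pretend to stand up a disposable environment from a config."""
    env_id = f"env-{random.randint(1000, 9999)}"
    print(f"provisioning {env_id} with {config}")
    return {"id": env_id, "config": config}

def run_load_test(env: dict) -> float:
    """Pretend to measure something meaningful, e.g. median response time (ms)."""
    return random.uniform(80, 120) / env["config"]["web_nodes"]

def teardown(env: dict) -> None:
    print(f"tearing down {env['id']}")

candidates = [
    {"web_nodes": 2, "cache": "none"},
    {"web_nodes": 4, "cache": "memcached"},
]

results = []
for config in candidates:
    env = provision(config)                      # cheap to create...
    results.append((run_load_test(env), config))
    teardown(env)                                # ...and cheap to throw away

best_latency, best_config = min(results, key=lambda r: r[0])
print(f"rolling out {best_config} (simulated latency {best_latency:.1f} ms)")
```

The mechanics are trivial; the point is that when environments are this cheap to create and destroy, comparing real configurations side by side becomes routine rather than a hardware project.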
Practical advice, please: what apps are cloud-ready?
At the Gartner session, audience members were very curious about the practicalities of using cloud computing, asking Guthrie to make some generalizations from his experience on everything from what kind of applications work best in a cloud environment, to how he won over his IT operations staff. (I don’t think last week’s audience is unique in wanting to learn some of these things.)
“Some apps aren’t ready for this kind of infrastructure,” said Guthrie, while “some are perfectly set for this.” And, contrary to the assumption many in the room expressed, it isn’t generally true that packaged apps are a better fit for a cloud environment than home-grown applications.
Guthrie’s take? You should decide which apps to take to the cloud not by whether they are packaged or custom, but with a different set of criteria. For example, stateless applications that live on the web are perfect for this kind of approach, he believes.
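If it helps to see that kind of criteria-based thinking written down, here’s a toy sketch of scoring applications on cloud fit. The criteria and weights are my own assumptions for illustration, not Guthrie’s methodology; the only points he made explicitly are that packaged vs. custom isn’t the deciding factor and that stateless, web-facing apps are a natural fit.

```python
# A toy illustration of Guthrie's point: judge cloud fit by the application's
# characteristics, not by whether it's packaged or home-grown. The criteria
# and weights are assumptions made up for this example.
CRITERIA = {
    "stateless": 3,        # stateless web apps were his prime example
    "web_facing": 2,
    "standard_stack": 2,   # common OS/middleware stacks are easy to template
    "spiky_demand": 1,     # elasticity pays off most when load varies
}

def cloud_fit_score(app: dict) -> int:
    """Sum the weights of the criteria this app satisfies."""
    return sum(weight for name, weight in CRITERIA.items() if app.get(name))

# Note that the "packaged" flag never enters the score.
apps = [
    {"name": "customer portal", "stateless": True, "web_facing": True,
     "standard_stack": True, "spiky_demand": True, "packaged": False},
    {"name": "legacy billing batch", "stateless": False, "web_facing": False,
     "standard_stack": False, "spiky_demand": False, "packaged": True},
]

for app in sorted(apps, key=cloud_fit_score, reverse=True):
    print(f"{app['name']}: cloud-fit score {cloud_fit_score(app)}")
```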
Dealing with skepticism about cloud computing from IT operations
One of the more interesting questions brought out what Guthrie learned about getting the buy-in and support of the operations team at PGi. “I don’t want you to think I’ve deployed these without a lot of resistance. I don’t want to act like this was just easy. There were people that really resisted it,” he said.
Earning the buy-in of his IT operations people required training for the team, time, and even quite a bit of evangelism, according to Guthrie. He also laid down an edict at one point: for everything new, he required that “you have to tell me why it shouldn’t go on the cloud infrastructure.” What seemed draconian at first actually turned out to be the right strategic choice.
“As the ops team became familiar [with cloud computing], they began to embrace it,” Guthrie said.
Have you had similar real-world experiences with the cloud? Or even contradictory ones? I’m eager to hear your thoughts on PGi’s approach and advice (as you might guess, I much prefer real-world discussions to theoretical ones any day of the week). Leave comments here or contact me directly; I’ll definitely continue this discussion in future posts.
Wednesday, October 13, 2010
The first 200 servers are the easy part: private cloud advice and why IT won’t lose jobs to the cloud
Posted by Jay Fry at 9:48 PM
The recent CIO.com webcast that featured Bert Armijo of CA Technologies and James Staten of Forrester Research offered some glimpses into the state of private clouds in large enterprises at the moment. I heard both pragmatism and some good, old-fashioned optimism -- even when the topic turned to the impact of cloud computing on IT jobs.
Here are some highlights worth passing on, including a few juicy quotes (always fun):
Cloud has executive fans, and cloud decisions are being made at a relatively high level. In the live polling we did during the webcast, we asked who was likely to be the biggest proponent of cloud computing in attendees’ organizations. 53% said it was their CIO or senior IT leadership. 23% said it was the business executives. Forrester’s James Staten interpreted this to mean that business folks are demanding answers, often leaning toward the cloud, and the senior IT team is working quickly to bring solutions to the table, often including the cloud as a key piece. I suppose you could add: “whether they wanted to or not.”
Forrester’s Staten gave a run-down of why many organizations aren’t ready for an internal cloud – but gave lots of tips for changing that. If you’ve read James’ paper on the topic of private cloud readiness (reg required), you’ve heard a lot of these suggestions. There were quite a few new tidbits, however:
· On creating a private cloud: “It’s not as easy as setting up a VMware environment and thinking you’re done.” Even if this had been anyone’s belief at one point, I think the industry has matured enough (as have cloud computing definitions) for it not to be controversial any more. Virtualization is a good step on the way, but isn’t the whole enchilada.
· “Sharing is not something that organizations are good at.” James is right on here. I think we all learned this on the playground early in life, but it’s still true in IT. IT’s silos aren’t conducive to sharing things. James went farther, actually, and said, “you’re not ready for private cloud if you have separate virtual resource pools for marketing…and HR…and development.” Bottom line: the silos have got to go.
· So what advice did James give for IT organizations to help speed their move to private clouds? One thing they can do is “create a new desired state with separate resources, that way you can start learning from that [cloud environment].” Find a way to deliver a private cloud quickly (I can think of at least one).
· James also noted that “a private cloud doesn’t have to be something you build.” You can use a hosted “virtual private cloud” from a service provider like Layered Tech. Bert Armijo, the CA Technologies expert on the webcast, agreed. “Even large customers start with resources in hosting provider data centers.” Enterprises with CA 3Tera AppLogic running at their service provider and internally can then move applications to whichever location makes the most sense at a given point in time, said Armijo. (There’s a quick sketch of that hybrid idea just after this list.)
· What about “cloud-in-a-box” solutions? James was asked for his take. “Cloud-in-a-box is something you should learn from, not take apart,” he said. “The degree of completeness varies dramatically. And the way in which it suits your needs will vary dramatically as well.”
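Here’s the sketch promised above. It’s entirely hypothetical – the grid names and the deploy() helper are inventions for this post, not real AppLogic tooling – but it shows the shape of the hybrid idea: one packaged application definition, two possible homes.

```python
# Hypothetical sketch of the hybrid point in the bullets above: one packaged
# application definition, deployable to a hosted grid or an internal one.
# The grid names and deploy() helper are inventions, not real AppLogic tooling.
APP_DEFINITION = {
    "name": "sales-gateway",
    "components": ["load_balancer", "web_server", "app_server", "database"],
}

GRIDS = {
    "hosted": "provider-grid.example.net",    # a rented "virtual private cloud"
    "internal": "grid01.corp.example.com",    # the same stack, run in-house
}

def deploy(app: dict, target: str) -> None:
    print(f"deploying {app['name']} "
          f"({len(app['components'])} components) to {GRIDS[target]}")

deploy(APP_DEFINITION, "hosted")    # start at a provider to get moving quickly...
deploy(APP_DEFINITION, "internal")  # ...bring it in-house later, unchanged
```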
The biggest cloud skeptics were cited as – no surprise – the security and compliance groups within IT, according to the polling. This continues to be a common theme, but shouldn’t be taken as a reason to toss the whole idea of cloud computing out, emphasized Staten. “Everyone loves to hold up the security flag and stop things from happening in the organization.” But don’t let them. It’s too easy to use it as an excuse for not doing something that could be very useful to your organization.
Armijo also listed several tips for finding successful starting points in the move to creating a private cloud. It was all about pragmatic first steps, in Bert’s view. “The first 200 servers are the easy part,” said Armijo. “Just because you can get a 50-server cloud up doesn’t mean you have conquered cloud.” His suggestions:
- Start where value outweighs the perceived risk of cloud computing for your organization (and it will indeed be different for each organization)
- Find places where you will have quick, repeated application or stack usage
- If you’re more on the bleeding edge, set up IT as an internal service provider to the various parts of the business. It’s more challenging, for sure, but there are (large) companies doing this today, and it will make profound improvements to IT’s service delivery.
Will cloud computing eliminate jobs? A bit of Armijo’s optimism was in evidence here: he said, in a word, no. “Every time we hit an efficiency wall, we never lose jobs,” he said. “We may reshuffle them. That will be true for clouds as well.” He believes more strategic roles will grow out of whatever changes cloud brings to IT.
“IT people are the most creative people on the face of the planet,” said Armijo. “Most of us got into IT because we like solving problems. That’s what cloud’s going to do – it’s going to let our creative juices flow.”
If you’re interested in listening to the whole webcast, which was moderated by Jim Malone, editorial director at IDG, you can sign up here for an on-demand, encore performance.
Tuesday, October 12, 2010
Delivering a cloud by Tuesday
Posted by Jay Fry at 10:26 PM
Much of what I heard at VMworld in San Francisco (and what is likely being repeated for the lucky folks enjoying themselves this week in Copenhagen) was about the long, gradual path to cloud computing. Lauren Horwitz captured this sentiment well in this VMworld Europe preview at SearchServerVirtualization.
And I totally agree: cloud computing is a long-term shift that will require time to absorb and come to terms with, while taking the form of public and private clouds – and even becoming a hybrid mix of the two. How IT departments in the largest organizations will assess and incorporate these new, more dynamic ways of operating IT is still getting sorted out.
Mike Laverick, the virtualization expert and the RTFM blogger quoted in the SearchServerVirtualization story, said that “users shouldn’t be scared thinking that they have to deliver a cloud by Tuesday of next week. It’s a long-term operational model over a 5- or 10-year period.”
But what if you can’t wait that long to deliver?
What if you really do have to deliver a cloud (or at least an application) next Tuesday? And then what if that app is expected to handle drastic changes the following Thursday?
Despite the often-repeated “slow and steady” messages about IT infrastructure evolution, it’s worth remembering that there’s another choice, too. After all, cloud computing is all about choices, right? And cloud computing is about using IT to make an immediate impact on your business. While there is a time and place for slow, steady, incremental change, there’s also a very real need now and again to make a big leap forward.
There are times to throw incrementalism out the window. In those situations, you actually do have another choice: getting a private cloud set up in record time with a more turnkey-type approach so you can immediately deliver the application or service you’re being asked for.
Turnkey cloud platform trade-offs
A turnkey approach to a cloud platform (which is, in the spirit of full disclosure, the approach our CA 3Tera AppLogic takes) can get you the speedy delivery times you’re looking for. Of course, the key to speed is having all of the complicating factors under your control. The 3Tera product does this, for example, by creating a virtual appliance out of the entire stack, from the infrastructure on up to the application. That simplifies things immensely by turning normally complicated components into software-only versions that can be easily controlled.
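To give a feel for what “the entire stack as one virtual appliance” might look like, here’s a minimal sketch under my own assumptions. The blueprint classes below are invented for this post – they are not the actual AppLogic description format – just an illustration of defining infrastructure and application together as a single deployable (and disposable) unit.

```python
# Invented blueprint classes, sketching the "whole stack as one virtual
# appliance" idea. This is not the actual AppLogic description format --
# only an illustration of defining infrastructure and application together
# as a single deployable (and disposable) unit.
from dataclasses import dataclass, field

@dataclass
class Component:
    role: str               # e.g. "load_balancer", "web", "db"
    image: str              # a software-only stand-in for what used to be a box
    cpu: float = 1.0
    memory_gb: float = 2.0

@dataclass
class Appliance:
    name: str
    components: list = field(default_factory=list)

    def deploy(self) -> None:
        # A turnkey platform instantiates the whole definition as one unit.
        print(f"deploying '{self.name}' as a single virtual appliance:")
        for c in self.components:
            print(f"  {c.role:<14} {c.image} ({c.cpu} CPU, {c.memory_gb} GB)")

portal = Appliance("customer-portal", [
    Component("load_balancer", "lb-1.2"),
    Component("web", "apache-php-5.3", cpu=2.0),
    Component("db", "mysql-5.1", memory_gb=8.0),
])
portal.deploy()
```

Because nothing in a definition like this points at specific hardware, the same appliance can be stood up for a project on Tuesday and torn down on Wednesday without anyone filing a procurement request.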
A turnkey cloud is probably best suited for situations where the quick delivery of the application or service is more critical than following existing infrastructure and operations procedures. After all, those procedures are the things that are keeping you from delivering in the first place. So there’s your trade-off: you can give yourself the ability to solve some of the messier infrastructure problems by changing some of the rules. And processes. The good news (for CA 3Tera AppLogic customers, anyway) is that even when you do break a few rules there’s a nice way to link everything back to your existing environment as that becomes important (and, eventually, required).
Create a real-world example of what your data center is evolving into
For these types of (often greenfield) situations, a turnkey private cloud platform gives you the chance to set up a small private cloud environment that delivers exactly what you’re looking for at the moment, which is the critical thing. It also sets you up for dramatic changes in requirements as conditions vary.
But beyond solving the immediate crisis (and preparing you for the next one), there’s a hidden benefit that shouldn’t be overlooked: you get experience working with your apps and infrastructure in this future, more cloud-like state. You get firsthand knowledge of the type of environment you’d like your entire data center to evolve into over the next 5-10 years. You’ll find out the things that work well. You’ll find out what the pitfalls are. Most of all, you’ll get comfortable with things working differently and learn the implications of that. That’s useful for the technology, but also for the previously mentioned processes, and, most of all, for the IT people involved.
So, when is that “big leap” appropriate?
The Forrester research by James Staten that I’ve referred to in previous posts talks about the different paths for moving toward a cloud-based environment – one more incremental and the other more turnkey. I’m betting you’ll know when you’re in the scenarios appropriate for each.
Service providers (including some of those working with us that I mentioned in my last post) have been living in this world for a few years now. They are very obviously eager to try new approaches. They are looking for real, game-changing cloud platform solutions upon which they can build their next decade of business. And the competitive pressures on them are enormous.
If you’re in an end user IT organization that’s facing an “uh oh” moment -- where you know what you want to get done, but you don’t really see how you can get there from here – it’s probably worth exploring this leap. At least in a small corner of your environment.
After all, Tuesday’s just around the corner.
Monday, October 4, 2010
New CA 3Tera release: Extending cloud innovations while answering enterprise & MSP requirements
Posted by Jay Fry at 5:38 AM
Today CA Technologies announced the first GA release of CA 3Tera AppLogic.
Now, obviously, it’s not the first release of the 3Tera product. That software, which builds and runs public and private clouds, is well known in its space and has been in the market for several years now. In fact, CA 3Tera AppLogic has 80+ customers spread across 4 continents and has somewhere in the neighborhood of 400 clouds currently in use, some of which have been running continuously for years. I mention a few of the service provider customers in particular at the end of this post.
However, this will be the first GA release of the product since being acquired back in the spring – the first release under the CA Technologies banner. A lot of acquisitions in this industry never make it this far. From my perspective, this one seems to be working for a few reasons:
Customers need different approaches to cloud computing.
We’re finding that customer approaches to adopting some form of cloud computing – especially for private clouds – usually take one of two paths. I talked about this in a previous post: one path involves incrementally building on your existing virtualization and automation efforts in an attempt to make your data center infrastructure much more dynamic. That one is more evolutionary. The second path is much more revolutionary, and it’s the one customers take when competitive pressure is high or the financial squeeze is on. Customers in that situation are looking for a more turnkey cloud approach, one that abstracts away a lot of the technical and process complexity. They are looking for a solution that gets them to cloud fast.
That last point is the key. A more turnkey approach like CA 3Tera AppLogic is going to be appealing when the delivery time or cost structure of the more slow & steady “evolutionary” approach is going to jeopardize success. That faster path is especially appropriate for managed service providers these days, given the rate of change and turbulence in the market (and the product’s partner & customer list proves that).
Now, having this more “revolutionary” approach in our portfolio alongside things like CA Spectrum Automation Manager lets us talk with customers about both paths. We get to ask what customers want, and then work with them on whichever approach makes sense for their current project. Truth is, they may take one path for one project and another path for the next. And, as Forrester’s James Staten mentioned in a webcast on CIO.com about private clouds (that CA sponsored) last week, quickly getting at least some part of your environment to its new (cloudy) desired state lets you learn quite a bit that applies to your IT environment as a whole.
The 3Tera cloud platform innovations remain, well…innovative.
The other reason (I think) that you’re seeing progress from the combination of CA Technologies and 3Tera is actually pretty basic: even after all this time, no one else is doing what this platform can do. The CA 3Tera AppLogic cloud platform takes an application-centric approach. This means that IT is able to shift its time, effort, and focus away from the detailed technology components, and instead focus on what’s important – the business service that they are trying to deliver.
This application focus is different. Most of what you see in the market doesn’t look at things this way. It is either an incremental addition to the way IT has been managing and improving virtualization, or focuses at a level much lower than the application. Or both.
Plus, if you’ve ever seen a demo of CA 3Tera AppLogic, you’ll also be struck by the simplicity that the platform’s very visual interface brings you. You draw up the application and infrastructure you want to deploy and it packages this all up as a virtual appliance. That virtual package can then be deployed or destroyed as appropriate. When you need something, you just draw it in. Need more server resources or load balancers? Drag them in from your palette and drop them onto the canvas. (Here’s a light-hearted video that explains all this pretty simply in a couple of minutes.)
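For readers who think better in code than in canvases, here’s another made-up sketch – again, not the real AppLogic interface – of what that drag-and-drop step amounts to: add a component to the blueprint, adjust a count, and redeploy the package.

```python
# Another made-up sketch (not the real AppLogic interface): the programmatic
# equivalent of dragging a new component onto the canvas and redeploying.
blueprint = {
    "name": "web-conference-portal",
    "components": [
        {"role": "web", "image": "apache-2.2", "count": 2},
        {"role": "db", "image": "mysql-5.1", "count": 1},
    ],
}

def redeploy(bp: dict) -> None:
    total = sum(c["count"] for c in bp["components"])
    print(f"redeploying '{bp['name']}' with {total} component instances")

# "Need more server resources or load balancers? Drag them in..."
blueprint["components"].append({"role": "load_balancer", "image": "lb-1.2", "count": 1})
blueprint["components"][0]["count"] += 2   # scale out the web tier
redeploy(blueprint)
```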
This technology was ahead of its time when it came out; keeping it out in front years later is really important, and a big focus for CA Technologies. The market is changing rapidly; we have to continue to lead the charge for customers.
Finding a way to meld innovation with extended enterprise-level & MSP requirements
I think it’s really good news that many of the new capabilities that CA Technologies is bringing to market with CA 3Tera AppLogic 2.9 are about making the product even more suitable as the backbone of an enterprise’s private cloud environment, or the underpinnings of a service provider’s cloud offerings. Things like high availability, improved networking, and new web services APIs should make a lot of the bigger organizations very happy.
Already, since 3Tera became part of CA Technologies, we’ve seen large enterprises become noticeably more eager to hear what CA 3Tera AppLogic is and what it can do. Maybe it’s the continued rough economic waters, but the big name standing behind 3Tera now goes a long way, from what I’ve heard and seen of customer reaction. Let’s just say, the guys out talking about this stuff are busy.
There are customers doing this today. Lots of them.
The true test of whether the timing is right, the technology is right, and the value proposition makes sense is, well, whether people are buying and then really using your solution. 3Tera started out seeing success with smaller MSPs, and that success continues. Enterprises have moved more slowly, but leading-edge folks are making moves now. The enterprise use cases are very interesting, actually: anything that involves spiky workloads, where standardized stacks need to be set up and torn down quickly, is ideal. I’m going to try to get some interviews for the blog with a couple of these folks when they are ready to talk.
There are some excellent MSPs and other service providers, on the other hand, that are out there offering CA 3Tera AppLogic-based services today and shaking things up.
“Gone are the days of toiling in the depths of the data center,” said David Corriveau, CTO of Radix Cloud, “where we’d have to waste the efforts of smart IT pros to manually configure every piece of hardware for each new customer.” Now Radix Cloud centralizes their technical expertise in a single location with the CA 3Tera AppLogic product, which they say improves security and reliability.
I talked to Mike Michalik, CEO of Cirrhus9, who said they use the product to quickly stand up new applications and websites for their clients, with robustness and product maturity being key ingredients for success.
Berlin-based ScaleUp Technologies is also knee-deep in delivering cloud services to customers. CEO Kevin Dykes told me (my summer vacation plans actually let me visit their offices in person) that a turnkey cloud platform means they, as a service provider, have a strong platform to start from but can still offer differentiated services. “Our customers directly benefit, too,” said Dykes. “They are able to focus on rapid business innovation, quickly and easily moving applications from dev/test to production, or adding capacity during spikes in demand.”
Jonathan Davis, CTO of DNS Europe, will be co-presenting with my colleague Matt Richards at VMworld Europe in Copenhagen, talking about how he’s been able to change the way DNS Europe does business. “Very quickly,” said Davis, “we have been able to ramp up new revenue streams and win new customers with even higher expectations and more complex needs.”
I find the user stories to be some of the most interesting pieces of this – and one of the things that is so hard to find in the current industry dialog about cloud computing. So, now that we have CA 3Tera AppLogic announced, I’ll be interviewing a few of these folks to hear firsthand what’s working – and what’s not – in their view of cloud computing. Check back here for more details.