
Wednesday, May 25, 2011

Hurford of DNS Europe: service providers and SaaS developers are showing enterprises how cloud is done

The antidote to cloud computing hype is, well, reality. For example: talking to people who are in the middle of working on cloud computing right now. We’ve started a whole new section of the ca.com website that provides profiles, videos, and other details of people doing exactly that.


One of the people highlighted on this list of Cloud Luminaries and Cloud Accelerators is Stephen Hurford of DNS Europe. DNS Europe is a London-based cloud hosting business with over 500 customers across the region. They have been taking cloud-based business opportunities very seriously for several years now, and they provide cloud application hosting and development, hybrid cloud integration services, and consulting to help customers make the move to cloud.


I convinced Hurford, who serves as the cloud services director for DNS Europe, to share a few thoughts about what customers and service providers are – and should be – doing right now, and to suggest a few smart strategies.


Jay Fry, Data Center Dialog: From your experiences, Stephen, how should companies think about and prepare for cloud computing? Is there something enterprises can learn from service providers like yourself?


Stephen Hurford, DNS Europe: Picking the right project for cloud is very important. Launching into this saying, “we’re going to convert everything to the cloud in this short period of time,” is almost doomed to fail. There is a relatively steep learning curve with it all.


But this is an area of opportunity for all of the service providers that have been using CA 3Tera AppLogic [like DNS Europe] for the last three years. We’re in a unique position to be able to help enterprises bypass the pitfalls that we had to climb out of.


Because the hosting model has become a much more accepted approach in general, enterprises are starting to look much more to service providers. They’re not necessarily looking to service providers to host their stuff for them, but to teach them about hosting, because that’s what their internal IT departments are becoming – hosting companies.


DCD: What is the physical infrastructure you use to deliver your services?

Stephen Hurford: We don’t have any data centers at all. We are a CSP that doesn’t believe in owning physical infrastructure apart from the rack inwards. We host with reputable Tier 3 partners like Level3, Telenor, and Interxion, but it means that we don’t care where the facility is. We can deploy a private cloud for a customer of ours within a data center anywhere in the world and with any provider.


DCD: When people talk about cloud, the public cloud is usually their first thought. There are some big providers in this market. How does the public cloud market look from your perspective as a smaller service provider? Is there a way for you to differentiate what you can do for a customer?


Stephen Hurford: The public cloud space is highly competitive – if you look at Amazon.com, GoGrid, Rackspace, the question is how can you compete in that market space as a small service provider? It’s almost impossible to compete on price, so don’t even try.


But one thing that we have that Amazon, Rackspace, and GoGrid do not have is an up-sell product – they cannot take their customers from a public cloud to a private cloud product. So when their customers reach a point where they say, “Well, hang on, I want control of the infrastructure,” that’s not what you get from Amazon, Rackspace, and GoGrid. From those guys you get an infrastructure that’s controlled by the provider. Because we use CA 3Tera AppLogic, the customer gets control, whether hosted by a service provider or by themselves internally.


DCD: My CA Technologies colleague Matt Richards has been blogging a bit about smart ways for MSPs to compete and win with so much disruption going on. Where do you recommend a service provider start if they want to get into the cloud services business today?


Stephen Hurford: My advice to service providers who are starting up is to begin with niche market targeting. Pick a specific service or an application or a target market and become very good at offering and supporting that.


We recommend starting at the top, by providing SaaS. SaaS is relatively straightforward to get up and running on AppLogic if you choose the right software. The templates already exist – they are in the catalog, they are available from other service providers, and they will soon be available in a marketplace of applications. Delivering SaaS offerings carries the lowest technical and learning overhead, and that’s why we recommend it.


DCD: Looking at your customer base, who is taking the best advantage of cloud platform capabilities right now? Is the area you just mentioned – SaaS – where you are seeing a lot of movement?

Stephen Hurford: Yes. The people who have found us and who “get it” are the SaaS developers. In fact, 90% of our customers are small- to medium-sized customers who are providing SaaS to enterprises and government sectors. It’s starting to be an interesting twist in the tale: these SaaS providers are starting to show enterprises how it’s done. They are the ones figuring out how to offer services. The enterprises are starting to think, “Well, if these SaaS providers can offer services based on AppLogic, why can’t I build my own AppLogic cloud?” That’s become our lead channel into the enterprise market.


DCD: How do service providers deal with disruptive technologies like cloud computing?

Stephen Hurford: From a service provider perspective, it’s simple: first, understand the future before it gets here. Next, push to the front if you can. Then work like crazy to drive it forward.


Cloud is a hugely disruptive technology, but without our low-cost resources we could not be as far forward as we are.


One of the fundamentally revolutionary aspects of AppLogic is that I only need thousand-dollar boxes to run this stuff on. And if I need more, I don’t need to buy a $10,000 box; I only need to buy four $1,000 boxes. I have a grid and it’s on commodity servers. That is where AppLogic stood out from everything else.


DCD: Can you explain a bit more about the economics of this and your approach to keeping your costs very low? It sounds like a big competitive weapon for you.

Stephen Hurford: One of the big advantages of AppLogic is that it has reduced our hardware stock levels by 75%. That’s because all cloud server nodes are more or less the same, so we can easily reuse a server from one customer to another, simply by reprovisioning it in a new cloud.


One of the key advantages we’ve found is that once hardware has reached the end of its workable life in terms of an enterprise’s standard private cloud, it can very easily be repurposed into our public cloud where there is less question of “well, exactly what hardware is it?” So we’ve found we can extend the workable lifespan of our hardware by 40-50%.


DCD: What types of applications do you see customers bringing to the cloud now? Are they starting with greenfield apps, since this would give them a clean slate? The decision enterprises make will certainly have an impact for service providers.

Stephen Hurford: Some enterprises are taking the approach of asking, “OK, how can I do my old stuff on a cloud? How do I do old stuff in a new way?” That’s one option for service providers: can I get the benefits of simplifying, reducing my stock levels, and reducing the management overhead of my entire system by moving to AppLogic, so that I don’t have 15 different types of servers and 15 different management teams? I can centralize it and get immediate benefits from that.


The other approach is to understand that this is a new platform with new capabilities, so I should look at what new stuff can I do with this platform. For service providers, it’s about finding a niche – being able to do something that works for a large proportion of your existing customers. Start there because those are the folks that know you and they know your brand. Think about what they are currently using in the cloud and whether they would rather do that with you.

DCD: Some believe that there is (or will soon be) a mad rush to switch all of IT to cloud computing; others (and I’m one of those) see a much more targeted shift. How do you see the adoption of cloud computing happening?


Stephen Hurford: There will always be customers who need dedicated servers. But those are customers who don’t have unpredictable growth and who need maximum performance. Those customers will see a much lower cost benefit from moving to the cloud.


For example, we were dealing with a company in Texas that wanted to move a gaming platform to the cloud. These are multi-player, shoot-‘em-up games. You host a game server that tracks the coordinates of every object in space in real time for 20 to 30 players and sends that data to the Xbox or PlayStation, which renders it so they can all play in the same gamespace. If you tried to do that with standard commodity hardware, you wouldn’t get the needed performance on the disk I/O.


The question to that customer was, “Do you have a fixed requirement? If you need 10 servers for a year and you’re not going to need to grow or shrink, don’t move to the cloud.” Dedicated hardware, however, is expensive. They said, “We don’t know what our requirements are and we need to be able to deploy new customers within 10 minutes.” I told them you don’t want to use dedicated servers for that, so you’re back to the cloud, but perhaps with a more tailored hardware solution such as SSD drives to optimize I/O performance.


DCD: So what do you think is the big change in approach here? What’s the change that a cloud platform like what you’re using for your customers is driving?

Stephen Hurford: Customers are saying it’s very easy with our platform to open up two browsers and move an entire application and infrastructure stack from New York to Tokyo. But that’s not enough.


Applications need to be nomadic. The concept of nomadic applications is still way off in the future, but for me, what we’re able to offer today is a clear signpost toward it. Applications and their infrastructure will eventually become completely separated from the hardware level and the hypervisor. My application can go anywhere in the world with its data and the OS it needs to run on. All it needs to do is plug into the juice (hardware, CPU, RAM), and I’ve got what I need.


Workloads like this will be able to know where they are. By the minute, they’ll be able to find the least-cost service provisioning. If you’ve got an application and it doesn’t really matter where it is and it’s easy to move it around, then you can take advantage of least-cost service provisioning within a wider territorial region.
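
To make that last idea a bit more concrete for the technically inclined, here’s a minimal sketch, in Python, of what a “least-cost placement” decision could look like for a workload that’s free to roam within a territory. Everything in it – the provider names, prices, and latency constraint – is hypothetical, and it’s nothing like the actual AppLogic machinery; it just illustrates the “re-evaluate placement periodically and move to the cheapest eligible site” loop Hurford is describing.

# Hypothetical sketch only: provider names, prices, and data structures are made up.
from dataclasses import dataclass
from typing import List

@dataclass
class Site:
    provider: str
    region: str
    price_per_cpu_hour: float  # cost of the "juice" (CPU/RAM) at this site
    latency_ms: int            # rough latency from this site to the app's users

@dataclass
class Workload:
    name: str
    cpus: int
    max_latency_ms: int        # the "wider territorial region" constraint

def cheapest_site(workload: Workload, sites: List[Site]) -> Site:
    """Pick the lowest-cost site that still satisfies the latency constraint."""
    eligible = [s for s in sites if s.latency_ms <= workload.max_latency_ms]
    if not eligible:
        raise ValueError("no site satisfies this workload's constraints")
    return min(eligible, key=lambda s: s.price_per_cpu_hour * workload.cpus)

if __name__ == "__main__":
    sites = [
        Site("ProviderA", "New York", 0.12, 20),
        Site("ProviderB", "London", 0.09, 35),
        Site("ProviderC", "Tokyo", 0.07, 95),
    ]
    app = Workload("web-tier", cpus=8, max_latency_ms=50)
    best = cheapest_site(app, sites)
    print(f"Move {app.name} to {best.provider} ({best.region})")

In the real world the decision would also weigh data gravity, compliance, and the cost of the move itself, but run something like this on a timer and you’ve got the “by the minute, least-cost” behavior he’s pointing at.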




Thanks, Stephen, for the time and for sharing your perspectives. You can watch Stephen’s video interview and read more about DNS Europe here.

Friday, May 14, 2010

As CA World approaches: a quick guide to CA's cloud acquisitions and some session suggestions

If you’re having trouble sorting through all of CA’s recent moves, I don’t blame you. It’s sort of ironic timing that key components of what we’ve been working on since last summer have been coming together in such a compressed timeframe. The good news is it means that when we get to CA World in a few days, we should have lots of good stuff to show folks, both from the recent acquisitions and from the organic development underway.

I thought I’d highlight the recent activity and point you to a couple good summaries of what CA’s up to so far. And hold on, because there are a few more (OK, lots) interesting things being announced at the conference next week. (No beans are spilled in this post, if that's what you're hoping for.) For those of you who will be in Vegas for CA World, I’ll give you a couple suggestions for presentations you’ll definitely want to hit, from among the 62 cloud-related sessions. (And for the impatient ones in the crowd, the overall cloud and SaaS session guide is here).

Recent cloud-related acquisitions (in chronological order)

They say you can’t tell the players without a program, so here’s a bit of a recap. I’ve linked to my blogs on each of these acquisitions if you want to dig into any one of these in more detail.

· Cassatt (my alma mater) focused on helping enterprises create policy-based private clouds, enabling you to constantly optimize the computing resources you are using based on service levels for a given application. The team joined CA back in June 2009 – and I think nearly all of us are still here.

· Oblicore, acquired in January 2010, is about service level management, with contract management at its center. In a world in which cloud computing is more and more common as an option, being able to translate technical data into business metrics is key. Being able to compare those metrics to your actual service level contracts is even more key. Customers of Oblicore include both enterprises and service providers.

· 3Tera gives you a really interesting approach to getting an application to run in a cloud. It encapsulates an app and its supporting infrastructure and lets you move it, scale it, and deploy it as needed. All those things you had to do manually and with physical hardware before (setting up servers, load balancers, firewalls, etc.), you can now connect, add, or remove, but in software form, with a slick GUI. Right now this is interesting for enterprises and very interesting for cloud service providers.

· Nimsoft provides performance and availability monitoring “from the data center to the cloud,” as they say. Rackspace just standardized on it, and the Nimsoft folks, now a stand-alone business unit of CA, just delivered an on-demand version. Their target (unlike the other technologies above) is primarily smaller, emerging enterprises and cloud service providers. Take a look at http://www.unifiedmonitoring.com/ for an idea of how easy it is to get started with this stuff.

More than just putting the pieces together

For a feel for how we are looking to help customers, there are a couple of good, recent write-ups that explain the way we see the world. Several folks in the industry press have written up some useful background.

Derrick Harris’s piece for GigaOM put a lot of the pieces together, even though it pre-dated even the 3Tera announcement. Charlie Babcock from InformationWeek did a piece for Plug Into the Cloud following an interview with Chris O’Malley, who is leading the Cloud Business Line at CA. InformationWeek did another piece this week, this time after talking to CEO Bill McCracken; it’s also a good summary. It even has a few teasers about next week’s event. Denise Dubie has also been following us pretty closely for Network World, and she did a good analysis of CA’s moves. Little did I know she was following us *so* closely that she decided to hop on board here at CA (a pleasant surprise; welcome, Denise!).

The business press has also been reporting on CA’s moves, especially after McCracken talked about the pot of money he was planning to apply to cloud computing acquisitions. Investor’s Business Daily did a Q&A with O’Malley, outlining the broad issues he sees customers having as they try to move toward running IT more like a supply chain, a metaphor taken from the manufacturing world. Bloomberg drew parallels between what CA’s trying to do and what our CA World keynoter James Cameron did with “Avatar” – change the game. I don't think we have anything cooking that involves very tall blue aliens. As far as I know, anyway.

CA World suggested sessions

If you’ll be in Vegas and you’re interested in cloud computing and software as a service issues, definitely look at the session listings on the CA World site. Here are some sessions I’m betting will be worth your time:

· CEO Bill McCracken’s keynote on Sunday night and Ajei Gopal’s technology keynote on Monday morning. Those will provide great context for everything else. They should also put CA's cloud computing efforts in context of everything else the company is working on (I’ve heard we do stuff around the mainframe, too).

· “Making the Cloud-Connected Enterprise Real,” Chris O’Malley on Monday at 11 a.m. This is the cloud and SaaS keynote session. I have it on good authority that there will be product news and some cool new technology previewed.

· There are drill-downs on cloud management by David Hodgson and Vince Re, and on IT management from the cloud by Jules Ehrlich on Monday afternoon. Great summaries of the breadth of activity in these areas.

· On Tuesday morning Chris O’Malley is emceeing a Cloud Power Session panel about what computing will look like in 2020 (cloud-based or otherwise, if the cloud term is even still in use then). It features experts from salesforce.com, Microsoft, CSC, Rackspace, and Logicalis. Should be really interesting. So should the virtualization power session Wednesday morning with VMware, Citrix, Microsoft, Red Hat, and Cisco. Glenn O’Donnell of Forrester will be in charge of keeping that session under control. Good luck.

· There are a couple of Amazon Web Services-related sessions worth noting, including Adam Selipsky’s session on Tuesday morning as part of a special business-themed track and one by Jeff Barr of AWS (along with CA’s Walter Guerrero) about service assurance and EC2.

· Sudhrity Mondal (another former Cassatt guy) is bringing his experience helping customers on cloud implementations both at Cassatt and CA to a session called, “The Process and Pitfalls of Moving to the Cloud: Lessons Learned from Actual Cloud Implementations.” That’s Tuesday afternoon.

I could go on, especially since I helped design the cloud and SaaS tracks...but I won’t. I’ll also restrain myself from highlighting my own Cloud 101 session. After all, if you read this blog, you can probably skip it. Or even give it.

Some other random items of interest:

· James Cameron’s keynote Monday: that’ll be fun. Oh, and Maroon 5 on Wednesday. They’re no Foreigner, but, hey…

· Meeting a bunch of you in person that I normally only talk to via Twitter. As always, I’m looking forward to that as much as the customer meetings I'm attending. Should be interesting all the way around.

And, yes, there will be sessions focused on the items being announced during the week (at least the cloud-related ones). But, no, I can’t tell you what they are yet. So don’t ask. Besides, Monday’s not that far away.

Tuesday, November 24, 2009

Is cloud a turkey for IT vendors? Thinking through its impact on end user IT spending

Amid preparations for Thanksgiving feasts here in the U.S., a couple recent predictions from IT industry experts could leave vendors with much less to be thankful for this year. I’ve seen more than one pundit say that even though cloud computing is the Next Big Thing for IT, it will, in the end, mean that less money is being spent on IT.

Is cloud computing really going to mean doing more with less?

Matt Asay, for example, recently noted that a new Goldman Sachs report called “A Paradigm Shift for IT: The Cloud” found that even though CIOs may be “loosening the purse strings on IT spending, IT vendors may want to hold off their celebrations.” Why? “Much of this spending appears to be headed for deflationary forces like cloud computing [and] virtualization.”

At SYS-CON’s Cloud Computing Expo earlier this month, I heard GoGrid CEO John Keagy announce an IT vendor’s moment of triumph and death knell in one breath. “The dream of utility computing has arrived,” he said, followed quickly with: “I believe we’re at the height of IT spending right now. You’ll see the IT economy shrink to half its current size.”

And, finally, during a panel at Interop last week reported by Joe McKendrick, AT&T’s Joe Weinman raised doubts about the sustainability of cloud computing economics, describing a scenario in which they break down as enterprise management requirements come into play. “I’m not sure there are any unit-cost advantages that are sustainable among large enterprises,” he said. He expects adoption of external cloud computing in some areas, and private capabilities for others.

Now that’s not the kind of market that the purveyors of cloud computing hype were likely hoping for. In fact, it looks to be a bit of a, um, turkey. “An economic rebound never looked so dire,” noted Asay. “That’s unless you’re an IT buyer, of course.”

More on Matt’s last comment in a moment. First, where are these comments coming from?

Goldman Sachs expects a big bump from cloud computing…

The dire remarks Asay picked up are interspersed throughout that pretty straightforward Goldman Sachs report he mentioned, which covers the basics of cloud computing. (In fact, a big section of the report is itself entitled “Cloud Computing 101: definition, drivers and challenges,” so you shouldn’t expect big leaps forward in analysis.)

However, there are a couple of interesting observations from the report about the importance of cloud computing and its relation to the IT spending environment:

· Cloud computing is “similar in importance to the transition from mainframes to client-server computing in the 1980s.”
· The new data center approaches and architectures required for cloud computing “are coming at an auspicious time: they are likely to coincide with a meaningful cyclical upturn in IT spending over the next couple of years.” One underlying reason for the uptick: the underinvestment that the bad economy has caused in the past 18 months or so.
· “Our checks suggest that enterprise IT demand has hit an inflection point and is now beginning to improve.”

Hang on, that all sounded a bit positive, at least to my ears. They expect growth from re-architecting systems “upon a foundation of virtualization and increased automation.” Goldman (and Asay) note the typical (read: big) players that will be taking advantage of that.

So this sounds rosy for 2010 and the years just beyond. Goldman says “we expect a meaningful increase in demand in the technologies that enable virtualization and increase the availability and security of external Clouds.”

…and now the “but”:

· But, longer term, cloud computing could actually “drive some headwinds for the IT industry” because they see virtualization (one of the interesting components that can be involved in cloud computing) as a “deflationary technology.” Goldman sees IT spending concentrating into the hands of fewer buyers: mainly cloud providers, hosting vendors, and large enterprises. They think better utilization will mean less pressure to spend, and when those buyers do spend, it’ll be a buyer’s market.

For SaaS, however, there was less uncertainty. SaaS offerings were seen to be expanding rapidly and leading other “external” cloud services. In fact, the report calls the move from on-premise to on-demand delivery an “unstoppable shift.” Evangelos Simoudis, managing director of Trident Capital, posted his thoughts from the Goldman event tied to the report, which underscored the SaaS discussion: “Two years ago,” wrote Simoudis, “the discussion was whether the enterprise will adopt any SaaS applications. Today the conversation is about which on-premise applications will be replaced by their SaaS equivalent.”

As always, baby steps to start

For areas of cloud computing other than SaaS, however, two themes emerged from the Goldman report that matched what other reputable reports have been saying. First, “for large enterprises, Cloud Computing is still very early.” And, second, “most industry participants we spoke with expect large enterprises to take baby steps in their adoption of Cloud Computing resources, especially for business-critical activities.”

Goldman highlights cloud management

Additional comments from Simoudis are relevant here in helping analyze what Goldman’s thoughts might mean. “Infrastructure-as-a-Service (IaaS) requires management skills that most IT shops today don’t have,” he said. As a result, it’s easy to see why Goldman also noted that “systems management [is] likely to become a key component of the platform over time.” Allan Leinwand of Panorama Capital, also speaking during the Interop panel I mentioned previously, underscored this management gap. “There’s a gap between what the enterprise is used to doing behind the firewall and what the cloud providers can do.”

Goldman sees traditional systems management players (BMC and CA being two they call out) going to market with more IT automation capabilities for provisioning workloads in both physical and virtual environments, as well as extending capabilities to the public cloud. But hang on to your hats: they also predict an eventual battle between systems management vendors and platform vendors such as VMware and Microsoft over where these capabilities end up. Something to watch, for sure.

“CIOs don’t want a single, ubiquitous stack for cloud computing,” predicts Simoudis, “but the ability to work simultaneously with multiple stacks coming from different vendors.” CA’s certainly betting on that approach. Goldman’s published take on CA is that it (along with others) has cloud computing as a “potential new growth vector.” That’s a comment I agree with, or (obviously) I wouldn’t still be here.

The opposite of deflationary: Cloud as another channel for growth?

So back to the question. Is IT spending headed for a permanent downshift? As enterprises find their way to the cloud, using baby steps or otherwise, we’ll certainly see if this deflationary impact is for real.

Now, I know that this whole cloud computing revolution is supposed to first and foremost be an economic one. As in, good for the economics of running IT. No one said anything about it being good for the IT vendors. Unless, of course, they figure out how to get on the side of the customers (always a good idea, if you ask me).

And if they do figure out how to side with the customers, maybe the cloud is in fact another channel for growth. Someone recently reminded me what early e-commerce over the Internet looked like back in 1999 (Black Friday was much different back then – and there wasn’t even a Cyber Monday). At that time, many analysts predicted the demise of the brick-and-mortar retail store. Instead, the Web has become another channel to sell goods; it has given people another way to buy and sell.

Is that a good way to look at what will happen with cloud computing as well? Or maybe that’s the wrong metaphor.

Well, it’s at least worth a thought between slices of pumpkin pie. Gobble, gobble.

Thursday, November 19, 2009

Salesforce.com and CA: using Force.com for agile and SaaS-y results

I don't normally spend too much time on application development topics, but CA has a bit of interesting SaaS-related news coming out of Dreamforce today that I thought was worth noting. And when an event bills itself as The Cloud Computing Event of the Year, I guess it's something I should at least mention.

Despite salesforce.com CEO Marc Benioff’s marathon keynote on Wednesday, he – and the crowds – were back for more today (albeit a bit behind schedule). That’s where CA CEO John Swainson joined him onstage to call cloud computing the “most profound change” he’s seen in computing in his 33 years in the business. They also announced a partnership between salesforce.com and CA and a nifty new CA SaaS application built on Force.com before Marc handed John a celebratory birthday bottle of Bordeaux (more on that below).

The product John showed off onstage is CA Agile Planner, a SaaS application for planning, tracking, and managing agile software development projects. Those familiar with how the agile development approach goes can probably pretty quickly pick up on its usefulness in managing backlogs, sprints, and burn-downs. It can be used standalone or in concert with CA’s Clarity PPM tool (something that can give you visibility and control over all of your organization’s development projects, agile or otherwise).

One of the “big guys” begins to use Force.com, plus collaboration with salesforce.com

CA is one of the first (and I’m guessing the largest) major software vendors to develop apps on Force.com. (See, there are some big guys building apps on Force.com.) Benioff highlighted a number of other apps using Force.com onstage at Dreamforce today as well, but only Swainson went home with a bottle of 1954 Chateau d'Yquem Sauternes.

Why the wine? (Other than to say happy 55th birthday, of course.) Maybe it was to highlight a bit of teamwork that's gone on: CA developed the Agile Planner product in collaboration with salesforce.com itself. The CA product team used salesforce.com’s expertise and best practices from their own transition to agile development for the first version.

A community for feedback

That’s all good, and CA is taking that real-world influence a step further, creating something called the CA Agile Community to go with this product. It’s an on-line community of agile development practitioners who can help shape the feature roadmap for CA Agile Planner. Anyone can join, and folks can share ideas for what should be in the product, vote on things others have suggested, and have discussions with others interested in the agile approach.

The community feedback aspect will mean input from users and experts can have a direct impact on the product. Unlike a lot of what’s been available on salesforce.com’s platform to date, this gives enterprise-level visibility, but with the easy-to-get-started attributes that come from being on the Force.com platform. A nice combo.

If you want to take a look, you can swing by the CA booth at Dreamforce or go to www.ca.com/agile. The press release on the partnership and the rest of the details of today’s announcement are here.

Now, if we could incorporate a way to vote on when Benioff should give attendees a mid-keynote bio break, I think everyone would be happy.