Posts Tagged 'Gartner'

August 4, 2016

Magic Quadrants, Performance Metrics & Water Cooler Discussions: Evaluating Cloud IaaS

When you make decisions about extending your infrastructure footprint into the cloud, you do so very intentionally. You hunt down analyst reports, ask peers for recommendations, and seek out quantitative research to compare the seemingly endless array of cloud-based options. But how can you be sure that you’re getting the most relevant information for your business case? Bias exists and definitions matter. So each perspective is really just a single input in the decision-making process.

The best process for evaluating any cloud solution involves four simple steps:

  1. Understand what you need.
  2. Understand what you’re buying.
  3. Understand how you’ll use it.
  4. Test it yourself.

Understand What You Need

The first step in approaching cloud adoption is to understand the resources your business actually needs. Are you looking to supplement your on-premises infrastructure with raw compute and storage power? Do your developers just need runtimes and turnkey services? Would you prefer infrastructure-abstracted software functionality?

In the past, your answers to those questions might have sent you to three different cloud providers, but times are changing. The lines between “Infrastructure as a Service,” “Platform as a Service,” and “Software as a Service” have blurred, and many cloud providers now deliver those offerings side by side. While SoftLayer cloud resources would be considered “infrastructure,” SoftLayer is only part of the broader IBM Cloud story.

Within the IBM Cloud portfolio, customers find IaaS, PaaS, and SaaS solutions to meet their unique workload demands. From an infrastructure perspective alone, IBM Cloud offers cloud servers and storage from SoftLayer; containers, databases, deployment, and monitoring tools within Bluemix; and turnkey OpenStack private cloud environments from Blue Box. We are integrating every component of the IBM Cloud portfolio into a seamless user experience so that when a customer needs to add cognitive capabilities or a private cloud or video services to their bare metal server infrastructure, the process is quick and easy.

Any evaluation of SoftLayer as a cloud provider would be shortsighted if it didn’t take into account the full context of how IBM Cloud is bringing together multiple unique, highly differentiated offerings to provide a dynamic, full-featured portfolio of tools and services in the cloud. And as you determine what you need in the cloud, you should look for a provider that enables the same kind of cross-functional flexibility so that you don’t end up splintering your IT environment across multiple providers.

Understand What You’re Buying

Let’s assume that you’re primarily interested in deploying raw compute infrastructure in the cloud, since that’s SoftLayer’s primary focus. The seemingly simple first step in choosing the cloud infrastructure that best meets your needs is to define what “cloud infrastructure” actually means for your business.

Technology analyst firm Gartner defines cloud IaaS as “a standardized, highly automated offering, where compute resources, complemented by storage and networking capabilities, are owned by a service provider and offered to the customer on demand. The resources are scalable and elastic in near real time, and metered by use.” While that definition seems broad, its Magic Quadrant for Cloud Infrastructure as a Service explains that when cloud resources are provisioned in “near real time,” that means they must be deployed in minutes (not hours). To be considered “metered by use,” they must be charged by the minute or hour (rather than by the month).

Given Gartner’s interpretation of “real time” and the “by use” measurement, bare metal servers that are fully configured by the customer and provisioned into a cloud provider’s data center (usually in about two hours and billed by the month) aren’t classified as cloud infrastructure as a service. That distinction is important, because many customers looking to extend workloads into the cloud are more interested in the performance of the resources than they are in provisioning times, and bare metal servers deliver better, more consistent performance than their virtualized counterparts.

The performance angle is important. Many cloud customers need servers capable of processing demanding big data workloads (data mining, numerical and seismic analysis, processing and rendering 3D video, real-time social media analysis, etc.). These workloads generally involve petabytes of data, and bare metal servers are better suited to running them; options like adding GPU cards for high-performance computing make them even more enticing. The fact is that most virtualized cloud servers that can be delivered in minutes or less are not capable of handling these demanding workloads at all, or at least not as well as more powerful bare metal servers that are available in just a couple of hours.

In contrast to Gartner’s definition, other analysts support the inclusion of monthly bare metal servers in cloud infrastructure decisions. In “The Truth About Price-Performance,” Frost & Sullivan explains, “Bare metal servers provide the highest levels of raw ‘throughput’ for high-performance workloads, as well as flexibility to configure storage and network resources.” And Forrester Research published a full report to address the question, “Is bare metal really ‘cloud’?” The answer was, again, a resounding yes.

Using Gartner’s definition, the majority of SoftLayer’s cloud infrastructure as a service offerings are classified as “noncloud,” so they are not considered or measured in evaluations like the Magic Quadrant for Cloud IaaS. And without the majority of our business represented, the interpretation of those results may be confusing.

In practice, customers actually choose SoftLayer because of the availability of the offerings that Gartner considers to be “noncloud.” For example, Clicktale, a SoftLayer client, explains, “SoftLayer gives us the flexibility we need for demanding workloads. The amount of data we process is enormous, but SoftLayer’s bare metal machines are the best out there and we have a high level of control over them—it’s like owning them ourselves.”  

Our unique cloud platform with full support of both bare metal servers and virtual servers delivers compute resources that better suit our customers’ workloads in the cloud. Whether or not you consider those resources “cloud” is up to you, but if you opt for a more limited definition, you’ll cut out a large, important segment of the cloud market.

Understand How You’ll Use It

Once you settle on a definition of what meets your workload’s needs in the cloud, it’s important to evaluate how a given cloud resource will actually be used. Many of the factors that go into this evaluation are actually supplementary to the resource itself. Is it accessible via API? How can you connect it to your on-premises infrastructure? Will the data and workloads hosted on these resources be delivered quickly and consistently when your customers or internal teams need them?

While some of these questions are relatively easy to answer, others are nuanced. For example, SoftLayer's data center footprint continues to expand around the world, but the seemingly pedestrian process of making servers available in a new facility or geography is only part of the story. Every new SoftLayer data center is connected to a single global network backbone that streamlines and accelerates data transfer to, from, and between servers. As a result, as our data center footprint grows, network performance improves between users in that geography and SoftLayer customer servers in every other data center around the world.

And what does that underlying network architecture mean in practice? Well, we’ve run public network performance tests that show network speeds consistently 35 percent to 700 percent faster than other “leaders” in the cloud space. Most industry reports, including Gartner’s Magic Quadrant for Cloud Infrastructure as a Service, fail to acknowledge the importance of network performance in their assessments of cloud resources, focusing instead on the features and functionality of a given offering on its own.

The underlying platform capabilities and network infrastructure that support a given cloud resource aren’t obvious when comparing the speeds and feeds of cloud server specifications. So as you evaluate a cloud provider, it’s important to look beyond “what’s in the box” to how cloud resources will actually perform, both on the server and between the server and your data’s users. And the best way to get an understanding of that performance is to run your own tests.

Test It Yourself

The process of choosing a cloud provider or adopting a specific cloud resource cannot be purely academic. The nature of cloud computing allows for on-demand deployment of resources for real-world testing at a low cost with no long-term commitments. Making a decision to go with a given cloud provider or resource based on what anyone says—be it Gartner’s MQ, Forrester, Frost & Sullivan, SoftLayer, or your nephew—could have huge implications for your business.

SoftLayer will continue working with third-party research firms to demonstrate how our cloud infrastructure delivers up to 440 percent better performance for the cost compared with our competitors, but those stats are meant to start a conversation, not end it.

We encourage prospective customers to try SoftLayer for free. You can do this by taking advantage of up to $500 in free cloud resources for a month. Put our servers and our underlying platform to the test. Then make your own assessments on the vision and execution of SoftLayer’s unique approach to cloud infrastructure as a service.
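
If you want a concrete starting point for that kind of hands-on evaluation, here is a minimal, provider-agnostic benchmarking sketch in Python. It is an illustration rather than a formal benchmark: the test URL, file size, and temp file path are placeholders you would swap for a large file hosted near your users and a path on the disk you actually want to measure.

    # Minimal benchmarking sketch (the test URL, file size, and temp path are
    # placeholders; substitute endpoints and paths that match your workload).
    import os
    import time
    import urllib.request

    TEST_URL = "https://example.com/100MB.bin"  # placeholder: a large file near your users

    def download_throughput(url, chunk_size=1 << 20):
        """Measure HTTP download speed in MB/s from this server to the URL."""
        start = time.monotonic()
        total = 0
        with urllib.request.urlopen(url) as resp:
            while True:
                chunk = resp.read(chunk_size)
                if not chunk:
                    break
                total += len(chunk)
        elapsed = time.monotonic() - start
        return (total / (1 << 20)) / elapsed

    def disk_write_throughput(path="bench.tmp", size_mb=256):
        """Measure sequential disk write speed in MB/s using a temporary file."""
        block = b"\0" * (1 << 20)
        start = time.monotonic()
        with open(path, "wb") as f:
            for _ in range(size_mb):
                f.write(block)
            f.flush()
            os.fsync(f.fileno())  # make sure the data actually reaches the disk
        elapsed = time.monotonic() - start
        os.remove(path)
        return size_mb / elapsed

    if __name__ == "__main__":
        print(f"network: {download_throughput(TEST_URL):.1f} MB/s")
        print(f"disk:    {disk_write_throughput():.1f} MB/s")

Run the same script on comparable trial servers from each provider you are considering, at a few different times of day, and you will have performance data that is far more relevant to your workloads than any analyst chart.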

Start Building

October 13, 2015

The SLayer Standard Vol. 1, No. 16

The week in review. All the IBM Cloud and SoftLayer headlines in one place.

The dawn of a new era
IBM’s fearless leader, Ginni Rometty, took the stage at the Gartner Symposium last week to discuss where Big Blue is headed.

What’s in store for IBM? Rometty called it the “cognitive era.” In that vein, IBM is forming the Cognitive Business Solutions group “to build cognitive innovations across industries.” About the new addition, Rometty said, “Digital business and digital intelligence equal the cognitive era.”

Pretty exciting stuff, if we do say so ourselves. Read more about it here.

Tackling mobile cloud security with AT&T
IBM and AT&T are working together on a mobile cloud security solution to improve mobile app and data security in the cloud.

How are they going to do it? Caleb Barlow, vice president of IBM Security, said, “To help protect organizations, employees, and data, IBM Security and AT&T are delivering a tested and easy-to-deploy set of complementary tools. We’re giving enterprise mobile device users stable, private access to data and apps in the cloud.” The approach will allow workforces to be “productive without compromising security and the mobile user experience.”

What are these tools Barlow references? Learn about the partnership here.

Tangled up in Big Blue
The world’s most famous supercomputer is following Rometty’s lead—Watson’s all about cognitive computing, too. Except Watson has Bob Dylan in his corner of the ring.

Wait, what? Bob Dylan? Yes, you read that correctly. The Jeopardy winner jokes with Dylan in a new IBM video.


September 24, 2012

Cloud Computing is not a 'Thing' ... It's a way of Doing Things.

I like to think that we are beyond 'defining' cloud, but what I find in reality is that we still argue over basics. I have conversations in which people still delineate things like "hosting" from "cloud computing" based on degrees of single-tenancy. Now I'm a stickler for definitions just like the next pedantic software-religious guy, but when it comes to arguing minutiae about cloud computing, it's easy to lose the forest for the trees. Instead of discussing underlying infrastructure and comparing hypervisors, we'll look at two well-cited definitions of cloud computing that may help us unify our understanding of the model.

I use the word "model" intentionally there because it's important to note that cloud computing is not a "thing" or a "product." It's a way of doing business. It's an operations model that is changing the fundamental economics of writing and deploying software applications. It's not about a strict definition of some underlying service provider architecture or whether multi-tenancy is at the data center edge, the server or the core. It's about enabling new technology to be tested and fail or succeed in blazing calendar time and being able to support super-fast growth and scale with little planning. Let's try to keep that in mind as we look at how NIST and Gartner define cloud computing.

The National Institute of Standards and Technology (NIST) is a government organization that develops standards, guidelines and minimum requirements as needed by industry or government programs. Given the confusion in the marketplace, there's a huge "need" for a simple, consistent definition of cloud computing, so NIST had a pretty high-profile topic on its hands. Its resulting Cloud Computing Definition describes five essential characteristics of cloud computing, three service models, and four deployment models. Let's table the service models and deployment models for now and look at the five essential characteristics of cloud computing. I'll summarize them here; follow the link if you want more context or detail on these points:

  • On-Demand Self Service: A user can automatically provision compute without human interaction.
  • Broad Network Access: Capabilities are available over the network.
  • Resource Pooling: Computing resources are pooled to serve multiple consumers using a multi-tenant model, with different physical and virtual resources dynamically assigned and reassigned.
  • Rapid Elasticity: Capabilities can be elastically provisioned and released.
  • Measured Service: Resource usage can be monitored, controlled and reported.

The characteristics NIST uses to define cloud computing are pretty straightforward, but they are still a little ambiguous: How quickly does an environment have to be provisioned for it to be considered "on-demand?" If "broad network access" could just mean "connected to the Internet," why include that as a characteristic? When it comes to "measured service," how granular does the resource monitoring and control need to be for something to be considered "cloud computing?" A year? A minute? These characteristics cast a broad net, and we can build on that foundation as we set out to create a more focused definition.

For our next stop, let's look at Gartner's view: "A style of computing in which scalable and elastic IT-enabled capabilities are delivered as a service using Internet infrastructure." From a philosophical perspective, I love their use of "style" when talking about cloud computing. Little differentiates the underlying IT capabilities of cloud computing from other types of computing, so when looking at cloud computing, we really just see a variation on how those capabilities are being leveraged. It's important to note that Gartner's definition includes "elastic" alongside "scalable" ... Cloud computing gets the most press for being able to scale remarkably, but the flip-side of that expansion is that it also needs to contract on-demand.

All of this describes a way of deploying compute power that is completely different from how we've done it in the decades we've been writing software. It used to take months to get funding and order the hardware to deploy an application. That's a lot of time and risk that startups and enterprises alike can now erase from their business plans.

How do we wrap all of those characteristics up into a unified definition of cloud computing? The way I look at it, cloud computing is an operations model that yields seemingly unlimited compute power when you need it. It enables (scalable and elastic) capacity as you need it, and that capacity's pricing is based on consumption. That doesn't mean a provider should charge by the compute cycle, generator fan RPM or some other arcane measurement of usage ... It means that a customer should understand the resources that are being invoiced, and he/she should have the power to change those resources as needed. A cloud computing environment has to have self-service provisioning that doesn't require manual intervention from the provider, and I'd even push that requirement a little further: A cloud computing environment should have API accessibility so a customer doesn't even have to manually intervene in the provisioning process (the customer's app could use automated logic and API calls to scale infrastructure up or down based on resource usage).
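
To make that last point concrete, here is a rough sketch of the kind of automated logic a customer's app could run. It is written in Python against a hypothetical provider API: the three functions passed in (get_cluster_cpu_utilization, provision_server, and cancel_server) are placeholders for whatever metric and provisioning calls your provider actually exposes, not real SoftLayer API methods.

    import time

    # Hedged, provider-agnostic autoscaling loop. The three callables are
    # hypothetical placeholders for real provider API calls.
    SCALE_UP_THRESHOLD = 0.80    # add a server above 80% average utilization
    SCALE_DOWN_THRESHOLD = 0.30  # release a server below 30% average utilization
    MIN_SERVERS = 2              # never shrink below a baseline footprint

    def autoscale(get_cluster_cpu_utilization, provision_server, cancel_server,
                  servers, poll_seconds=300):
        """Grow or shrink a pool of servers based on measured utilization."""
        while True:
            utilization = get_cluster_cpu_utilization(servers)
            if utilization > SCALE_UP_THRESHOLD:
                servers.append(provision_server())       # elastic expansion
            elif utilization < SCALE_DOWN_THRESHOLD and len(servers) > MIN_SERVERS:
                cancel_server(servers.pop())              # elastic contraction
            time.sleep(poll_seconds)                      # periodic, unattended check

The details vary by provider, but the point is that no human needs to be in the loop: the application observes its own load and invokes the API on its own schedule.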

I had the opportunity to speak at Cloud Connect Chicago, and I shared SoftLayer's approach to cloud computing and how it has evolved into a few distinct products that speak directly to our customers' needs:

The session was about 45 minutes, so the video above has been slimmed down a bit for easier consumption. If you're interested in seeing the full session and getting into a little more detail, we've uploaded an uncut version here.


August 22, 2007

SaaS Who?

I'm always on the lookout for drivers of the hosting industry. One of those drivers is "Software as a Service" (SaaS). After seeing this article where Gartner thinks that the SaaS market will basically triple over the next four years, I wondered if the SaaS movement was affecting me yet.

I then realized that the SaaS market had grabbed me in one very important area – personal finance software. I had become disenchanted with my common desktop personal finance package – it had a lot of features that I didn't use, and it didn't do some things that I needed. So I began searching for options and discovered Mvelopes.

Mvelopes is a SaaS solution. Rather than buy the software, you subscribe to it over the Web. Anywhere you have an Internet connection you can access it, even from a mobile phone. It logs into the web sites of your bank accounts, investment accounts, retirement accounts, credit card accounts, etc., and brings your account info together in a convenient one-screen view. It will download info from accounts that your old desktop software can’t touch. And because it’s a subscription, you never have to worry about keeping up with new versions, patches, etc. Every time you log on you have the latest production software updates at your disposal.

It doesn't do asset allocation, portfolio analysis, technical analysis, stock screens or other fancy things that desktop personal finance software attempts to do. But it does planning and budgeting VERY well. This is what I need my personal financial software to do most of all, and my old software didn’t do this very well.

Probably the biggest hurdle in latching on to a SaaS solution like this is getting comfortable with placing data security outside of your control, since the data doesn't reside on your local machine. But when you realize that the vendor hosts the data at a secure data center and has far better data security, physical security, and network security than your spare upstairs bedroom office, it is possible to make the move.

According to Gartner, software consumers will quickly realize the simplicity of subscribing to secure hosted solutions that are accessible from anywhere. Naturally, these rapidly growing SaaS solutions need a home. Consequently, we at SoftLayer would like to ask SaaS providers like Mvelopes: "Who does your hosting?"

