Author Archive: Matt Chilek

April 4, 2012

Sharing a Heavy Load - New Load Balancer Options

I always think of Ford, Chevy and Toyota pick-up truck commercials when I think of load balancers. The selling points for trucks invariably boil down to performance, towing capacity and torque, and I've noticed that users evaluating IT network load balancers have a similar simplified focus.

The focus is always on high performance, scalability, failover protection and network optimization. When it comes to "performance," users are looking for reliable load balancing techniques — whether it be round robin, least connections, shortest response or persistent IP. Take one of those truck commercials and replace "towing capacity" with "connections per second" and "torque" with "application acceleration" or "SSL offloading," and you've got yourself one heck of a load balancer sales pitch.
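Those techniques differ only in how the next backend is chosen. Here's a minimal sketch of the two most common ones in Python (the server names and connection counts are hypothetical, just to illustrate the selection logic):

```python
from itertools import cycle

servers = ["web1", "web2", "web3"]

# Round robin: rotate through backends in a fixed order.
rr = cycle(servers)

def round_robin():
    return next(rr)

# Least connections: pick the backend with the fewest active connections.
active = {"web1": 12, "web2": 4, "web3": 9}

def least_connections():
    return min(active, key=active.get)
```

Round robin spreads requests evenly regardless of load, while least connections adapts when some requests take longer than others.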

SoftLayer's goal has always been to offer a variety of local and global load balancing options, and today, I get to announce that we're broadening that portfolio.

So what's new?

We've added the capability of SSL offloading to our shared load balancers and launched a dedicated load balancer option as well. These new additions to the product portfolio continue our efforts to make life easier on our customers as they build their own fully operational virtual data center.

What's so great about SSL offloading? It accelerates the processing of SSL-encrypted websites and makes it easier to manage SSL certificates. Think of this as adding more torque to your environment, speeding up how quickly traffic can be decrypted (coming in) and encrypted (heading out).

Up until now, SoftLayer has offered SSL at the server level. This requires a separate SSL certificate for each server or special certs that can be used on multiple servers. With SSL offloading, incoming traffic is decrypted at the load balancer rather than at the server level, and the load balancer also encrypts outbound traffic. This means traffic is processed in one place — at the load balancer — rather than on multiple servers sitting behind it.

With SoftLayer SSL offloading on shared load balancers, customers can start small with few connections and grow on the fly by adding more connections or moving to a dedicated load balancer. This makes it a breeze to deploy, manage, upgrade and scale.

What do the new load balance offerings look like in the product catalog? Here's a breakdown:

Shared Load Balancing
- 250 Connections with SSL: $99.99
- 500 Connections with SSL: $199.99
- 1,000 Connections with SSL: $399.99

Dedicated Load Balancer
- Standard with SSL: $999.00

I'm not sure if load balancing conjures up the same images for you of hauling freight or working on a construction site, but however you think about them, load balancers play an integral part in optimizing IT workloads and network performance ... They're doing the heavy lifting to help get the job done. If you're looking for a dedicated or shared load balancer solution, you know who to call.

-Matt

February 14, 2012

Open Source, OpenStack and SoftLayer

The open-source model has significantly revolutionized not only the IT industry but the business world as well. In fact, it was one of the key "flatteners" Thomas Friedman covered in his tour de force on globalization — The World is Flat. The trend toward collaborating on online projects — including open-source software, blogs, and Wikipedia — remains one of "the most disruptive forces of all."

The success of open-source projects like Linux, Ruby on Rails, and Android reveals the strength and diversity of having developers around the world contributing and providing feedback on code. The community becomes more than the sum of its parts, driving innovation and constant improvement. The case has been made for open source in and of itself, but a debate still rages over the developing case for businesses contributing to open source. Why would a business dedicate resources to the development of something it can't sell?

The answer is simple and straightforward: Contributing to open source fosters a community that can inspire, create and fuel the innovation a business needs to keep providing its customers with even better products. It makes sense ... Having hundreds of developers with different skills and perspectives working on a project can push that project further faster. The end result is a product that benefits the open-source community and the business world. The destiny of the community or the product cannot be defined by a single vendor or business; it's the democratization of technology.

Open-Source Cloud Platforms
Today, there are several open-source cloud platforms vying for industry dominance. SoftLayer has always been a big proponent and supporter of open source, and we've been involved with the OpenStack project from the beginning. In fact, we just announced SoftLayer Object Storage, an offering based on OpenStack Object Storage (code-named Swift). We'll provide code and support for Swift in hopes that it continues to grow and improve. The basic idea behind Swift Object Storage is to create redundant, scalable object storage using clusters of standardized servers to store petabytes of accessible data. I could go on and on about object storage, but I know Marc Jones has a blog specifically about SoftLayer Object Storage being published tomorrow, and I don't want to steal too much of his thunder.

We have to acknowledge and embrace the heterogeneous nature of the IT industry. Just as you might use multiple operating systems and hypervisors, we plan on working with a variety of open-source cloud platforms. Right now, we're looking into supporting initiatives like Eucalyptus, and we have our ear to the street to listen to what our customers are asking for. Our overarching goal is to provide our customers with much-needed technologies that are advancing the hosting industry, and one of the best ways to get there is to serve the needs of the open-source community.

As I write this blog post, I can't help but think of it in terms of a Lord of the Rings reference: "One ring to rule them all." The idea that "one ring" is all we need to focus on as a hosting provider just doesn't work when it comes to the open-source community ... It all comes down to enabling choice and flexibility. We'll keep investing in innovation wherever we can, and we'll let the market decide which ring will rule where.

What open-source projects are you working on now? How can SoftLayer get involved?

-Matt

January 20, 2010

Mexican Food vs. On Demand Infrastructure

My friend Ric Moseley has an interesting theory regarding Mexican food. He claims that all Mexican food has the same basic major components; each dish just stacks the components up in different ways. The major components are tortillas, meat and sauce. Of course, there are a couple of different ways to prepare each of these components, but in the end, it really boils down to tortillas, meat and sauce. This applies to just about every mainline dish you'll find on the menu at any number of the local Tex-Mex restaurants: crispy tacos, soft tacos, enchiladas, tostadas, burritos, fajitas, nachos, quesadillas, flautas, tamales (well, almost)... I'm going to stop myself before I start sounding like Benjamin "Bubba" Buford Blue from Forrest Gump, but you get the idea. By no means am I knocking the combined assembly. Quite the opposite; I'm a huge fan! When it comes down to it, I appreciate the creativity involved in putting these ingredients together in such a manner that the finished combination is far greater than the sum of its parts.

And that kind of leads me back to what SoftLayer brings to the table, so to speak. SoftLayer provides all sorts of components for the modern enterprise. Plenty of folks use them as is; heck, who doesn't enjoy a warm, fresh tortilla with a pat of butter? However, for many people, that's just an appetizer. The real satisfaction comes when the components unite and that steaming plate of enchiladas arrives. One of the great satisfactions of my job is seeing how our customers roll up our components in new and creative ways. The array of application deployments hosted by SoftLayer is staggering. Let me throw on my digital chef hat for a minute: Start with a private network database, add public network servers, mix in some cloud computing for quick scalability, and wrap it all in a load balancer. ¡Qué bueno! That's some good cooking, and this chef is off to the margarita machine!

June 3, 2009

Microsoft Still Following the Leader with Bing.com Offering

The new search engine "Bing" by the software colossus Microsoft is a sad attempt at capturing some of the search engine traffic that internet superstar Google has dominated for quite some time. Based on the preview video at bing.com, the search engine offers little in the way of new features or innovation, instead catering to the "too-lazy-to-click-the-back-button" crowd with expanded link previews on the search results page. I have personally found this type of feature to be near worthless, as information of value is typically more than a few lines from the top. Then again, maybe my five-button mouse has numbed me to the indignation so many users have suffered by having to move the cursor to click the back button after discovering a web page wasn't quite what they were after. (Google added longer previews in March.)

Microsoft representatives point out the technological advancement of augmenting standard keyword searches with some semantic-based algorithms. This alone should yield significantly better results than the current Microsoft engine, "MSN Live Search." (Google rolled out its semantic searches months ago.)

Next, Microsoft offers the “Conjecture Circle” to combat Google’s “Wonder Wheel”. OK, I’m just kidding on that one. Besides, it is only June, and Microsoft is still catching up with Google’s March features. They will not be taking on the “Wonder Wheel” until August or September.

I think I see a pattern here! This "innovation" reeks of lag. While taking the conservative copycat approach might be the safe thing for the boys from Redmond, it will never vault them to the front of the line in this market. The turbo boost for technology industries is clearly tied to new ideas and advancement. We see this time and time again as startups bring new whiz-bang tools to market and shoot right past the established giants. Time will, of course, tell. Fortunately, in the fast-paced world of the internet, we will not have to wait long to see if Bing will go bang.

September 3, 2008

IPv4 vs. Big Oil

Everyone is complaining about the price of gas at the pump. It's a plain fact that it costs more than it used to fill up the tank. Why is that? If you picked a handful of economists at random, you would likely get a different story from each of them. One explanation often mentioned of late is speculation in the oil futures market. Not being a business guy, I hadn't really ever paid attention to the oil futures market, much less the futures market in general. The speculation on oil prices got me thinking: why do people think oil is going to go up in the future?

Most likely because it is a finite resource, and at some point it will become unobtainable through reasonable means. I personally think that advances in technology will keep the black gold flowing for quite a while, but I am nowhere near naïve enough to believe that an infinite amount of oil can be contained within the finite confines of the globe we call Earth. Still, there is enough out there, either undiscovered or untapped, to keep our civilization plugging along well after Al Gore has melted all the ice caps with his private jet.

This led me to consider the impending depletion of the IPv4 address pool. Unlike the supply of that magical natural resource, oil, the available IPv4 address space cannot be augmented by new technology. There are no hidden underground caches to be found. It's not like an expedition off the coast of Chile will stick a pipe in the ground and IP addresses will start spewing out. For IPv4, what you see is what you get, and what I see is the last 20% of a shrinking pool.
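The arithmetic behind that fixed pool is simple: an IPv4 address is 32 bits, so there can never be more than 2^32 of them, while IPv6's 128 bits make the pool astronomically larger. A quick back-of-the-envelope check (the 20% figure is this post's estimate, not something computed here):

```python
ipv4_total = 2 ** 32    # every possible IPv4 address: about 4.3 billion
ipv6_total = 2 ** 128   # every possible IPv6 address: about 3.4 x 10^38

# Roughly the "last 20%" of the IPv4 pool mentioned above.
remaining_v4 = int(ipv4_total * 0.20)

print(f"{ipv4_total:,}")  # 4,294,967,296
```

Four billion sounds like a lot until you remember it has to cover every device, server and network on the planet, with large blocks already reserved or allocated.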

In theory, the answer is easy: everyone just needs to jump on the new IPv6 train instead of riding around in their old-fashioned IPv4 cars. In practice, the solution is not quite that simple. That fancy IPv6 train is very limited right now. It currently requires special tracks, and they only go certain places, none of which is grandma's house. Ultimately, user demand will force local ISPs to start supporting IPv6; in the great dance known as capitalism, the ISPs will bow to that demand and provide the service. However, between now and that future lies a pinch.

It's that last squeeze of toothpaste before you have to run to the store and get another tube. The hosting industry, being the most voracious consumer of IPv4 addresses, is actively working toward IPv6 deployment. The real question is how long until the home ISPs start supporting it. All the address space in the world doesn't help if consumers can't browse there. And to that end, doesn't all that legacy IPv4 address space become a precious commodity? In the not-so-distant future, is there a speculative market for IPv4 real estate? I see it as a real possibility. I just wouldn't want to be the one owning that venture when the last telecom announces IPv6 support.

-Matt
