Posts Tagged 'Programming'

December 8, 2011

UNIX Sysadmin Boot Camp: bash - Keyboard Shortcuts

On the support team, we're jumping in and out of shells constantly. At any time during my work day, I'll see at least four instances of PuTTY in my task bar, so one thing I learned quickly was that efficiency and accuracy at the command line ultimately make life easier for our customers and for us as well. Spending too much time retyping paths and commands, navigating in vi, and cycling through history can really bring you to a crawl. So now that you have had some time to study bash and practice a little, I thought I'd share some of the keyboard shortcuts that help us work as effectively and as expediently as we do. I won't be able to cover all of the shortcuts, but these are the ones I use most:

Tab

[Tab] is one of the first keyboard shortcuts that most people learn, and it's ever-so-convenient. Let's say you just downloaded pckg54andahalf-5.2.17-v54-2-x86-686-Debian.tar.gz, but a quick listing of the directory shows you ALSO downloaded 5.1.11, 4.8.6 and 1.2.3 at some point in the past. What was that file name again? Fret not. You know you downloaded 5.2.something, so you just start with, say, pckg, and hit [Tab]. This autocompletes everything that it can match to a unique file name, so if there are no other files that start with "pckg," it will populate the whole file name (and this can occur at any point in a command).

In this case, we've got four different files that are similar:
pckg54andahalf-5.2.17-v54-2-x86-686-Debian.tar.gz
pckg54andahalf-5.1.11-v54-2-x86-686-Debian.tar.gz
pckg54andahalf-4.8.6-v54-2-x86-686-Debian.tar.gz
pckg54andahalf-1.2.3-v54-2-x86-686-Debian.tar.gz

So typing "pckg" and hitting [Tab] brings up:
pckg54andahalf-

NOW, what you could do, knowing what files are there already, is type "5.2" and hit [Tab] again to fill out the rest. However, if you didn't know what the potential matches were, you could double-tap [Tab]. This displays all matching file names with that string.

Another fun fact: This trick also works in Windows. ;)

CTRL+R

[CTRL+R] is a very underrated shortcut in my humble opinion. When you've been working in the shell for untold hours parsing logs, moving files and editing configs, your bash history can get pretty immense. Often you'll come across a situation where you want to reproduce a command or series of commands that were run regarding a specific file or circumstance. You could type "history" and pore through the commands line by line, but I propose something more efficient: a reverse search.

Example: I've just hopped on my system and discovered that my SVN server isn't doing what it's supposed to. I want to take a look at any SVN related commands that were executed from bash, so I can make sure there were no errors. I'd simply hit [CTRL+R], which would pull up the following prompt:

(reverse-i-search)`':

Typing "s" at this point would immediately return the first command with the letter "s" in it in the history ... Keep in mind that's not just starting with s, it's containing an s. Finishing that out to "svn" brings up any command executed with those letters in that order. Pressing [CTRL+R] again at this point will cycle through the commands one by one.

In the search, I find the command that was run incorrectly ... There was a typo in it. I can edit the command within the search prompt before hitting enter and committing it to the command prompt. Pretty handy, right? This can quickly become one of your most used shortcuts.

CTRL+W & CTRL+Y

This pair of shortcuts is the one I find myself using the most. [CTRL+W] will basically take the word before your cursor and "cut" it, just like you would with [CTRL+X] in Windows if you highlighted a word. A "word" doesn't really describe what it cuts in bash, though ... It uses whitespace as a delimiter, so if you have an ultra long file path that you'll probably be using multiple times down the road, you can [CTRL+W] that sucker and keep it stowed away.

Example: I'm typing nano /etc/httpd/conf/httpd.conf (Related: The redundancy of this path always irked me just a little).
Before hitting [ENTER] I tap [CTRL+W], which chops that path right back out and stores it in memory (bash's "kill ring," if you want the technical term). Because I want to run that command right now as well, I hit [CTRL+Y] to paste it back into the line. When I'm done with that and I'm out referencing other logs or doing work on other files and need to come back to it, I can simply type "nano " and hit [CTRL+Y] to go right back into that file.

CTRL+C

For the sake of covering most of my bases, I want to make sure that [CTRL+C] is covered. Not only is it useful, but it's absolutely essential for standard shell usage. This little shortcut performs the invaluable act of killing whatever process is running in the foreground by sending it an interrupt signal (SIGINT). This works for most anything, aside from the programs that have their own interfaces and quit commands (vi, nano, etc). If you start something, there's a pretty good chance you're going to want to stop it eventually.

I should be clear that this will terminate a process unless that process is otherwise instructed to trap [CTRL+C] and perform a different function. If you're compiling something or running a database command, generally you won't want to use this shortcut unless you know what you're doing. But, when it comes to everyday usage such as running a "top" and then quitting, it's essential.

Repeating a Command

There are four simple ways you can easily repeat a command with a keyboard shortcut, so I thought I'd run through them here before wrapping up:

  1. The [UP] arrow will display the previously executed command.
  2. [CTRL+P] will do the exact same thing as the [UP] arrow.
  3. Typing "!!" and hitting [Enter] will execute the previous command. Note that this actually runs it. The previous two options only display the command, giving you the option to hit [ENTER].
  4. Typing "!-1" will do the same thing as "!!", though I want to point out how it does this: When you type "history", you see a numbered list of commands executed in the past -1 being the most recent. What "!-1" does is instructs the shell to execute (!) the first item on the history (-1). This same concept can be applied for any command in the history at all ... This can be useful for scripting.

Start Practicing

What it really comes down to is finding what works for you and what suits your work style. There are a number of other shortcuts that are definitely worth a look. There are plenty of cheat sheets on the internet available to print out while you're learning, and I'd highly recommend checking them out. Trust me on this: You'll never regret honing your mastery of bash shortcuts, particularly once you've seen the lightning speed at which you start flying through the command line. The tedium goes away, and the shell becomes a much more friendly, dare I say inviting, place to be.

Quick reference for these shortcuts:

  • [TAB] - Autocomplete as far as possible in a matching file name or path
  • [CTRL+R] - Reverse search through your bash history
  • [CTRL+W] - Cut one "word" back, up to the previous whitespace
  • [CTRL+Y] - Paste a previously cut string
  • [CTRL+P] - Display the previously run command
  • [UP] - Display the previously run command

-Ryan

August 23, 2011

SOAP API Application Development 101

Simple Object Access Protocol (SOAP) is built on server-to-server remote procedure calls over HTTP. The data is formatted as XML, which means well-formed, structured data will be sent to and received from SoftLayer's API. This may take a little more time to set up than the REST API, but it can be more scalable as you programmatically interface with it. SOAP's ability to tunnel through existing protocols such as HTTP and its innate ability to work in an object-oriented structure make it an excellent choice for interaction with the SoftLayer API.

This post gets pretty technical and detailed, so it might not appeal to our entire audience. If you've always wondered how to get started with SOAP API development, this post might be a good jumping-off point.

Authentication
Before you start playing with the SoftLayer SOAP API, you will need to find your API authentication token. Go into your portal account, and click the "Manage API Access" link from the API page under the Support tab. At the bottom of the page you'll see a drop down menu for you to "Generate a new API access key" for a user. After you select a user and click the "Generate API Key" button, you will see your username and your API key. Copy this API key, as you'll need it to send commands to SoftLayer's API.

PHP
In PHP 5.0+ there are built-in classes for handling SOAP calls, which lets us quickly create an object-oriented, server-side application for making SOAP requests to SoftLayer's API. This tutorial is going to focus on PHP 5.1+ as the server-side language for making SOAP function calls. If you haven’t already, you will need to install the SOAP extension for PHP; the PHP manual has installation directions.

Model View Controller

Model-View-Controller or MVC is a software architecture commonly used in web development. This architecture simply provides separation between a data abstraction layer (model), the business logic (controller), and the resulting output and user interface (view). Below, I will describe each part of our MVC "hello world" web application and dissect the code so that you can understand each line.

To keep this entry a little smaller, the code snippets I reference will be posted on their own page: SOAP API Code Examples. Protip: Open the code snippets page in another window so you can seamlessly jump between this page and the code it's referencing.

Model
The first entry on the API Code Examples page is "The Call Class," a custom class for making basic SOAP calls to SoftLayer's API. This class represents our model: the SOAP API call. When building a model, you need to think about what properties that model has. For instance, a model of a person might have the properties first name, height, weight, etc. Once you have properties, you need to create methods that use them.

Methods are verbs; they describe what a model can do. Our "person" model might have the methods run, walk, stand, etc. Models need to be self-sustaining, which means we need to be able to set and get a property from multiple places without the values getting jumbled up, so each model has a "set" and a "get" method for each of its properties. A model is a template for an object; when you store a model in a variable, you are creating an instance of that model, and the variable becomes the instantiated object.
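
To make that concrete, here's a bare-bones "person" model in PHP. This is a hypothetical class, just to illustrate properties paired with set and get methods:

    <?php
    class Person
    {
        private $_firstName;   // a property of the model

        // "Set" method: store the property.
        public function setFirstName($name)
        {
            $this->_firstName = $name;
        }

        // "Get" method: retrieve the property.
        public function getFirstName()
        {
            return $this->_firstName;
        }
    }

    $person = new Person();          // instantiating the model; $person is the object
    $person->setFirstName('Ada');
    echo $person->getFirstName();    // prints "Ada"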

  • Properties and Permissions
    Our model has these properties: username, password (apiKey), service, method, initialization parameters, the service's WSDL, SoftLayer's type namespace, the SOAP API client object, options for instantiating that client, and a response value. The SOAP client class is built into PHP 5.1+ (take a look at the “PHP” section above), so our model will instantiate a SOAP client object and use it to communicate with SoftLayer's SOAP API.

    Each of our methods and properties is declared with a certain visibility (protected, private, or public), which sets whether or not outside functions or extending classes have access to it. I "set" things using the "$this" variable; $this represents the immediate class that the method belongs to. I also use the arrow operator (->), which accesses a property or method (to the right of the arrow) belonging to whatever is on the left of the arrow, in this case $this. I gave as many of the properties default values as I could; this way, when we instantiate our model, we get a fully fleshed-out object without much work, which comes in handy if you are instantiating many different objects at once.

  • Methods
    I like to separate my methods into 4 different groups: Constructors, Actions, Sets, and Gets:
    • Sets and Gets
      Sets and gets simply provide a place within the model to set and get the properties of that model. This is a standard of object-oriented programming and gives the model a good bit of scalability: rather than accessing a property directly, always refer to the function that gets or sets it. This can prevent you from accidentally changing a property's value when you are only trying to read it. Lines 99 to the end of our Call class are where the sets and gets are located.

    • Constructors
      Constructors are methods dedicated to setting options in the model; lines 23-62 of the Call model are our constructors. The beauty of these three functions is that they can be copied into any model to perform the same function; just make sure you keep to the Zend coding standards.

      First, let’s take a look at the __construct method on line 24. This is a special magic PHP method that always runs immediately when the model is instantiated. We don’t want to do any actual processing in this method, because if we use the default object we won’t be passing any options to it, and unnecessary processing would slow response times. We pass the options in an array called $Setup. Notice that I am using type hinting and a default parameter when declaring the function; this way I don’t have to pass anything to the model when instantiating it. If values were passed in the $Setup variable (which must be an array), then we run the “setOptions” method.

      Now take a look at the setOptions method on line 31. This method searches the model for a set method matching each option passed in the $Setup variable, using the built-in get_class_methods function. It then passes the value and name of that option to another magic method, the __set method.

      Finally, let’s take a look at the __set and __get methods on lines 45 and 54. These methods create a kind of shorthand access to properties within the model; this is called overloading. Overloading allows the controller to access properties more quickly and efficiently. (All of these magic methods appear together in the condensed sketch just after this list.)

    • Actions
      Actions are the traditional verbs that I mentioned earlier; they are the “run”, “walk”, “jump”, and “climb” of our person model. We have 2 actions in our model, the response action and the createHeaders action.

      The createHeaders action creates the SOAP headers that we will pass to the SoftLayer API; this is the most complicated method in the model. Understanding how SOAP is formed and how to get the correct output from PHP is the key to accessing SoftLayer’s API. On line 77, you will see an array called Headers; this stores the headers that we are about to make so that we can easily pass them along to the API client.

      First we will need to create the initial headers to communicate with SoftLayer’s API. This is what they should look like:

      <authenticate xsi:type="slt:authenticate" xmlns:slt="http://api.service.softlayer.com/soap/v3/SLTypes/">
          <username xsi:type="xsd:string">MY_USERNAME</username>
          <apiKey xsi:type="xsd:string">MY_API_ACCESS_KEY</apiKey>
      </authenticate>
      <SoftLayer_API_METHODInitParameters xsi:type="v3:SoftLayer_API_METHODInitParameters" >
          <id xsi:type="xsd:int">INIT_PARAMETER</id>
      </SoftLayer_API_METHODInitParameters>

      In order to build this we will need a few saved properties from our instantiated object: our API username, API key, the service, initialization parameters, and the SoftLayer API type namespace. The API username and key will need to be set by the controller, or you can add yours to the model as defaults. I will store mine in a separate file and include it in the controller, but on a production server you might want to store this info in a database and create a "user" model.

      First, we instantiate SoapVar objects for each authentication node that we need. Then we store those SoapVar objects in an array and create a new SoapVar object for the "authenticate" node; the data for the "authenticate" node is that array, and the encoding is type SOAP_ENC_OBJECT. Understanding how to nest SoapVar objects is the key to creating well-formed SOAP in PHP. Finally, we instantiate a new SoapHeader object and append it to the Headers array. The second header we create and add to the Headers array is for initialization parameters. These are needed to run certain methods within SoftLayer’s API; they essentially identify objects within your account. The final command in this method (__setSoapHeaders) is the magic PHP method that saves the headers into our SoapClient object. Now take a look at how I access that method: because I have stored the SoapClient object as a property of the current class, I can reach the client's methods with the arrow operator through the $_client property of our class, or through the getClient() method of our class, which returns the client.

      The Response method is the action that actually contacts SoftLayer’s API and sends our SOAP request. Take a look at how I tell PHP that the string stored in our $_method property is actually a method of our $_client property: by adding parentheses to the end of the $Method variable on line 71.
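
To tie the whole model together, here is a condensed, hypothetical sketch of a Call class in the spirit of the one described above. It is not the code from the examples page: the endpoint URL, defaults, and property names here are stand-ins, and plenty of detail is omitted. It simply shows the same pattern in one place: a magic constructor, overloaded __set/__get, nested SoapVar objects for the headers, and a dynamic method call on the SOAP client.

    <?php
    class Call
    {
        // Defaults, so a bare "new Call()" still yields a usable object.
        protected $_username   = 'MY_USERNAME';
        protected $_apiKey     = 'MY_API_ACCESS_KEY';
        protected $_service    = 'SoftLayer_Account';
        protected $_method     = 'getObject';
        protected $_initParams = null;
        // The endpoint pattern is illustrative; check SLDN for the real WSDL locations.
        protected $_endpoint   = 'https://api.softlayer.com/soap/v3/';
        protected $_namespace  = 'http://api.service.softlayer.com/soap/v3/SLTypes/';
        protected $_client     = null;

        // Magic constructor: does no processing unless options were actually passed.
        public function __construct(array $Setup = array())
        {
            if (!empty($Setup)) {
                $this->setOptions($Setup);
            }
        }

        // Match each option against an existing set method, then hand it to __set.
        public function setOptions(array $Setup)
        {
            $methods = get_class_methods($this);
            foreach ($Setup as $name => $value) {
                if (in_array('set' . ucfirst($name), $methods)) {
                    $this->__set($name, $value);
                }
            }
        }

        // Overloading: shorthand access to the underscored properties.
        public function __set($name, $value) { $this->{'_' . $name} = $value; }
        public function __get($name)         { return $this->{'_' . $name}; }

        // One set/get pair per property; only a few are shown here for brevity.
        public function setUsername($value)   { $this->_username   = $value; }
        public function setApiKey($value)     { $this->_apiKey     = $value; }
        public function setService($value)    { $this->_service    = $value; }
        public function setMethod($value)     { $this->_method     = $value; }
        public function setInitParams($value) { $this->_initParams = $value; }
        public function getClient()           { return $this->_client; }

        // Build the SOAP headers out of nested SoapVar objects.
        public function createHeaders()
        {
            $this->_client = new SoapClient($this->_endpoint . $this->_service . '?wsdl');

            // The authenticate node: an array of SoapVars wrapped in a SoapVar.
            $auth = new SoapVar(array(
                'username' => new SoapVar($this->_username, XSD_STRING),
                'apiKey'   => new SoapVar($this->_apiKey, XSD_STRING),
            ), SOAP_ENC_OBJECT);
            $Headers = array(new SoapHeader($this->_namespace, 'authenticate', $auth));

            // Init parameters identify a specific object (a server, a domain, etc.).
            if ($this->_initParams !== null) {
                $init = new SoapVar(array(
                    'id' => new SoapVar((int) $this->_initParams, XSD_INT),
                ), SOAP_ENC_OBJECT);
                $Headers[] = new SoapHeader($this->_namespace,
                    $this->_service . 'InitParameters', $init);
            }

            // __setSoapHeaders saves the headers into the SoapClient object.
            $this->_client->__setSoapHeaders($Headers);
        }

        // Contact the API: the string in $_method becomes a method call on the client.
        public function response()
        {
            $this->createHeaders();
            $Method = $this->_method;
            return $this->_client->$Method();
        }
    }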

View
The view is what the user sees; it is where we present our information and create a basic layout for the web page. Take a look at "The View" section on SOAP API Code Examples. Here I create a basic web page layout, display output information from the controller, and create a form for sending requests to the controller. Notice that the view is a mixture of HTML and PHP, so make sure to name it view.php; that way the server knows to process the PHP before sending the page to the client.
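
As a rough sketch of the shape such a view might take (the field names and the controller.php file name are hypothetical; the real markup is on the examples page):

    <html>
      <body>
        <h1>SoftLayer SOAP API Sandbox</h1>

        <!-- Output handed to us by the controller, if there is any. -->
        <pre><?php if (isset($Result)) { print_r($Result); } ?></pre>

        <!-- Form that posts the user's request back to the controller. -->
        <form action="controller.php" method="post">
          Service: <input type="text" name="service" value="SoftLayer_Account" /><br />
          Method: <input type="text" name="method" value="getObject" /><br />
          Init parameter: <input type="text" name="initParams" /><br />
          <input type="submit" value="Send Request" />
        </form>
      </body>
    </html>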

Controller
The controller separates user interaction from business logic. It accepts information from the user and formats it for the model. It also receives information from the model and sends it to the view. Take a look at "The Controller" section on SOAP API Code Examples. I accept variables posted from the view and store them in an array to send to the model on lines 6-11. I then instantiate the $Call object with the parameters specified in the $Setup array, and on line 17 I store the response from the Response method as $Result for use by the view.
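
A minimal controller in the same spirit might look like the sketch below. The apikey.php include and its variable names are stand-ins for however you store your credentials; the real controller is on the examples page:

    <?php
    require_once 'Call.php';     // the model
    require_once 'apikey.php';   // defines $apiUsername and $apiKey in a separate file

    // Accept what the view posted and format it for the model.
    $Setup = array(
        'username' => $apiUsername,
        'apiKey'   => $apiKey,
        'service'  => isset($_POST['service']) ? $_POST['service'] : 'SoftLayer_Account',
        'method'   => isset($_POST['method'])  ? $_POST['method']  : 'getObject',
    );
    if (!empty($_POST['initParams'])) {
        $Setup['initParams'] = (int) $_POST['initParams'];
    }

    // Instantiate the model, run the call, and hand the result to the view.
    $Call   = new Call($Setup);
    $Result = $Call->response();

    include 'view.php';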

Have Fun!
Although this tutorial seems to cover many different things, it just opens up the basic utilities of SoftLayer's API. You should now have a working view to enter information and see what kind of data you will receive. The first service and method you should try is the SoftLayer_Account service and the getObject method. This will return your account information. Then try the SoftLayer_Account service and the getHardware method; it will return all of the information for all of your servers. Take the IDs from those servers and try out the SoftLayer_Hardware_Server service and the getObject method with one of those IDs as the init parameter.
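
Using the hypothetical Call sketch from earlier, those first experiments might look something like this (the hardware ID is made up; use one returned by getHardware):

    <?php
    require_once 'Call.php';

    // Account information: no init parameter needed.
    $Account = new Call(array(
        'username' => 'MY_USERNAME',
        'apiKey'   => 'MY_API_ACCESS_KEY',
        'service'  => 'SoftLayer_Account',
        'method'   => 'getObject',
    ));
    print_r($Account->response());

    // A single server: the init parameter identifies which one.
    $Server = new Call(array(
        'username'   => 'MY_USERNAME',
        'apiKey'     => 'MY_API_ACCESS_KEY',
        'service'    => 'SoftLayer_Hardware_Server',
        'method'     => 'getObject',
        'initParams' => 1234,   // a hardware ID from SoftLayer_Account::getHardware
    ));
    print_r($Server->response());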

More examples to try: SoftLayer Account, SoftLayer DNS Domain, SoftLayer Hardware Server. Once you get the hang of it, try adding Object Masks and Result Limits to your model.

Have Fun!

-Kevin

February 14, 2011

The Black Cat

"Dogma." "Religion." What comes to mind when you hear these words? In the real world, you might think of Christianity, Islam, or Judaism. In the political world, you might think of Communism vs. Freedom. Closer to home, you might think of "red state" and "blue state."

Computers are deterministic, logical machines, yet they too have all the trappings of the world's major religions and dogmas. To use the religion metaphor, the desktop world is dominated by Microsoft and Apple. If we are talking about dogma, all computing worlds can be broken down into "proprietary" and "open source".

Relevant to this discussion, the web development world has three major religions in those two dogmas: Microsoft's ASP.NET, the PHP world, and the Java world. My platform of choice has always been ASP.NET.

I am pretty solid in my reasons for preferring it over all others, and also pretty clear about the accidental reasons I found myself in this camp. Much as someone born into a particular religion is likely to freely adopt it at some point, I too ended up adopting ASP.NET for reasons that were nothing more than accidental.

I consider myself a 'citizen of the world' in more ways than one, and the opportunity to work at SoftLayer was an opportunity I couldn't turn down. I had to check my biases at the door, open my mind, and see how this side of the aisle does business. (And as if to remind me that a dogmatic shift has occurred in my professional life, Fox News continues to greet me every morning at the top of the stairs.)

To admit just some of my biases: How on earth did you build an enterprise-grade portal with a weakly typed language that doesn't require something as basic as a compiler? More importantly: Why? How does one work with such a thing? Some of you still use VI?? Seriously?

Fast forward about six months — just enough to say I am "proficient" in PHP and to have an exposure to the database side of things. The journey and rants are long and technical, but it should come as no surprise that I still prefer the Microsoft ecosystem over one based on PHP. I find it easier to work with, faster, and less error-prone than the alternative. The language is more structured, the tooling is better, and the framework better established and developer-oriented.

Humor me with this for a moment.

Assume for the sake of argument that my belief is correct — that Microsoft's offerings are indeed better than PHP's on every metric a developer can measure. If this is true, one might reasonably conclude that SoftLayer erred in its choice of development platform. Even though I will be the first to evangelize the virtues of the Microsoft ecosystem, I'll also be the first to say that this conclusion is wrong.

The conclusion is wrong because in asking the "Why?" in "Why SoftLayer chose the platform it did," I approached the question from the perspective of what's best for the developer. The question should have instead been phrased as: "What does SoftLayer's choice of platform say about our core values?"

It isn't exactly open source. Place the source code on any laptop, and you'll get the modern-day equivalent of summary execution: You will be fired.

It isn't developer convenience. It isn't needed. From what I've seen here, the developers have used their tools in a more extensive and architecturally correct way than I have in my time in the ASP.NET ecosystem.

The elusive answer can be summed up in one word: Independence. Fierce independence if you're into using superlatives.

While the Microsoft ecosystem may be the easiest on developers, it comes at a price. Microsoft's ultimate responsibility is to the thousands of people who use its tools, so it has to steer its platforms in a way that fits the disparate needs of the many developers who rely on them. By relying on its own software, built on open-source offerings, SoftLayer can steer its platform in a way that benefits SoftLayer ... It has only its own needs to consider.

The soundness of this reality should be obvious, as should the necessity of being fully independent when your core offering is the basic infrastructure that runs other people's businesses.

Very often we become overprotective of our dogmas, and fear that which we do not fully understand. To that end, I try to remember the words of an unlikely capitalist: "It doesn't matter if a cat is black or white, so long as it catches mice."

-George

August 19, 2010

The Girls' Engineering Club

I remember when I got started in computing. For the morbidly curious it was officially "a long time ago" and I'm afraid that's all I'm going to say other than to note that a major source of inspiration for me was the movie TRON, or more specifically the computer graphics in that movie (naturally I'm looking forward to the release of the new TRON movie!).

Computers have come a long way since then, and what they've gained in power, they've also lost in simplicity. To draw an analogy, the kids of my father's generation, who spent a lot of time in the garage tinkering with cars, would have to make a big technological leap before they could monkey with the guts of today's newfangled automobiles. In a similar fashion, the computers of my era, with built-in Integer BASIC and simple graphics modes, have given way to the mouse-driven, fully graphical user interfaces of today. Where I started programming by entering a few lines of text at a prompt and watching my code spit out streams of text in return, these days an aspiring programmer has to create a significant chunk of code to put up a window into which they can display their results, before they can write the code that generates those results.

In short, there's a bit more of a learning curve to get started. While kids are a bit farther along when they start out, it doesn't hurt to give them a push where you can.

Several months ago, the counselor at the local elementary school called to invite my daughter to join a newly-formed Engineering Club for the girls in the fifth grade. My daughter had scored well in her math and science tests and they wanted her to be a part of a pilot program to help foster an interest in science and engineering. For various reasons (most having to do with bureaucracy) the school was unable to get the program off the ground. My wife, not wanting the girls to miss out on an opportunity, took the program off-campus and created an informal club, divorced from the school, and driven by the parents. The Girls Engineering Club was born.

The club has a dozen or so young ladies as members, and since they're not tied to the school calendar, they have met once or twice a month through the summer. In the club they explore applications of science, mathematics, and technology with a particular focus on experimentation. For example, the club formed shortly after the recent oil spill in the Gulf of Mexico. The girls spent their first meeting talking about what the professional engineers were doing at the time, and then trying to find ways to separate motor oil from water using things like sand, coffee filters and dish soap. When I got home that day, I saw the aftermath. I hope the girls learned a lot... it was certainly clear that they had made a big mess and had a lot of fun.

It became my turn to help when the club took up the subject of Software Engineering. I'd like to say that the club leadership took me on because I have degrees in Computer Science and I'm a professional Software Engineer by trade. In truth, however, I think it was just my wife who thought I needed something better to do with my weekend than play video games. For whatever reason, however, I was pressed into service to teach the girls about Software Engineering.

Naturally I wanted to teach the girls a little bit about how engineering principles apply to the creation of software. But I imagine that a group of pre-teen women would find an hour-and-a-half exposition on the subject at best half as exciting as that last sentence makes it sound. Moreover, these girls were used to hands-on engineering club meetings. If the girls didn't "get their hands dirty" with at least a little bit of programming, the meeting would be a bust. The problem was: How do you teach a dozen pre-teen girls about programming, and on a shoestring budget?

When I was taking computer science classes in school we had very expensive labs with carefully controlled software environments. For the club, each girl probably has a computer at their house, but I wasn't anxious to ask parents to pull them out of place, drag them somewhere that we could set them up, and then slog through the nightmare of trying to get a semi-uniform environment on them.

Instead, I gathered up my veritable museum of computer hardware. Using those that were only a few years old, and still capable of running Mac OS X, I pulled together three that could be wirelessly networked together and have their screens shared. It was a bit of an ad-hoc arrangement, but functional.

Next came the question of subject matter. In my daily life I work with a programming language called Objective-C. Objective-C is a really fun language, but it requires a pretty hefty tool chain to use effectively. I didn't want to burn a lot of my hour and a half with the girls teaching them about development tools... I wanted them writing code. Clearly Objective-C wasn't the answer.

A while back I read about a book called Learn to Program by Chris Pine. Mr. Pine had created a web site dedicated to helping people who had never programmed before learn enough to get started. After the web site had been around a while, and after a bunch of folks had offered their comments and suggestions to improve it, he collected the information from the web site into the book.

The book uses a programming language called Ruby as its teaching tool. Ruby is a fantastic language. It's one of the so-called "fourth generation" scripting languages (along with Python, Perl, JavaScript, and others). The language was designed to scale from the needs of the novice programmer up to the demands of the professional Software Engineer. For the girls in the club, however, the nice thing about Ruby is that it provides a "Read, Evaluate, Print, Loop" (REPL) tool called IRB (Interactive Ruby). Using IRB, you can type in a Ruby expression and see the results of executing that expression right away. This would provide the great hands-on experience I was looking for in a reasonably controlled environment. More importantly it would run (and run the same way) on my collection of rapidly-approaching-vintage hardware.

I wanted to get a copy of the book for the girls. The Pragmatic Programmers offer many of their books, including this one, in electronic formats (PDF and eBook). I contacted them about a volume or educational discount on a PDF copy of the book. A company representative was kind enough to donate the book for the girls in our club!! You could have knocked me over with a feather. That gift put the train on the track and the wheels in motion.

(In appreciation, let me mention that Learn To Program is available in its Second Edition from The Pragmatic Bookshelf today. This is not an official endorsement by SoftLayer, but it is an enthusiastic recommendation from your humble author who is very grateful for their generous gift).

In the end, the club meeting was on a very rainy day. We struggled to keep the computer equipment dry as we hauled it to the home of one of the club members. Their poor kitchen table became a tangle of cords carrying power and video signals. Using shared screens, and my iPad as a presentation controller, I walked the girls through a Keynote presentation about some of the basic concepts of Software Engineering. Then we fired up Ruby in IRB and I showed the girls how to work with numbers, variables, and simple control structures in Ruby. They had to sit three to a computer, but that also let them help one another out. They learned to use loops to print out silly things about me (for example, I had my computer print out "Mr. Thompson rocks!"; the girls felt that they absolutely must get their computer to print "Mr. Thompson most certainly does not rock!" 1000 times). There was an awful lot of giggling, but as the teacher I was proud to see them pick up the basic concepts and apply them to their own goals. My favorite exclamation was "Wow! I could use this to help me with my homework."

As a Software Engineer, I spend an awful lot of my time sitting in front of a screen watching text scroll by. My colleagues and I have meetings where we work together on hard problems and come up with creative solutions, but just as the computing environments of the day have become more complex, I've become a bit jaded to the discovery and wonder I enjoyed when I poked away at my computer keyboard all those years ago. One of the benefits of volunteering is not what you do for others, but what they can do for you. With the Girls Engineering Club, I got to experience a little of that joy of discovery once again. The price was a little elbow grease, some careful thought, and a bit of my time. It was absolutely a bargain.

June 4, 2008

Wait … Back up. I Missed Something!

I’ve been around computers all my life (OK, since 1977 but that’s almost all my life) and was lucky to get my first computer in 1983.

Over the summer of 1984, I was deeply embroiled in (up to that point) the largest programming project of my life, coding Z80 ASM on my trusty CP/M computer when I encountered the most dreaded of all BDOS errors, “BDOS ERROR ON B: BAD SECTOR”

In its most mild form, this cryptic message simply means “copy this data to another disk before this one fails.” However, in this specific instance, it represented the most severe case… “this disk is toast, kaputt, finito, your data is GONE!!!”

Via the School of Hard Knocks, I learned the value of keeping proper backups that day.
If you’ve been in this game for longer than about 10 milliseconds, it’s probable that you’ve experienced data loss in one form or another. Over the years, I’ve seen just about every kind of data loss imaginable, from the 1980s accountant who tacked her data floppy to the filing cabinet with a magnet so she wouldn’t misplace it, all the way to enterprise/mainframe-class SAN equipment that pulverizes terabytes of critical data in less than a heartbeat due to operator error on the part of a contractor.

I’ve consulted with thousands of individuals and companies about their backup implementations and strategies, and am no longer surprised by administrators who believe they have a foolproof backup utilizing a secondary hard disk in their systems. I have witnessed disk controller failures which corrupt the contents of all attached disk drives, operator error and/or forgetfulness that leave gaping holes in so-called backup strategies and other random disasters. On the other side of the coin, I have personally experienced tragic media failure from “traditional backups” utilizing removable media such as tapes and/or CD/DVD/etc.

Your data is your life. I’ve waited up until this point to mention this, because it should be painfully obvious to every administrator, but in my experience the mentality is along the lines of “My data exists, therefore it is safe.” What happens when your data ceases to exist, and you become aware of the flaws in your backup plan? I’ll tell you – you go bankrupt, you go out of business, you get sued, you lose your job, you go homeless, and so-on. Sure, maybe those things won’t happen to you, but is your livelihood worth the gamble?

“But Justin… my data is safe because it’s stored on a RAID mirror!” I disagree. Your data is AVAILABLE, your data is FAULT TOLERANT, but it is not SAFE. RAID controllers fail. Disaster happens. Disgruntled or improperly trained personnel type ‘rm -rf /’ or accidentally select the wrong physical device when working with the Disk Manager in Windows. Mistakes happen. The unforeseeable, unavoidable, unthinkable happens.

Safe data is geographically diverse data. Safe data is up-to-date data. Safe data is readily retrievable data. Safe data is more than a single point-in-time instance.

Unsafe data is “all your eggs in one basket.” Unsafe data is “I’ll get around to doing that backup tomorrow.” Unsafe data is “I stored the backups at my house which is also underwater now.” Unsafe data is “I only have yesterday’s backup and last week’s backup, and this data disappeared two days ago.”

SoftLayer’s customers are privileged to have the option to build a truly safe data backup strategy by employing the EVault option on StorageLayer. This solution provides instantaneous off-site backups, efficiently utilizes tight compression and block-level delta technologies, is fully automated, has an extremely flexible retention policy system permitting multiple tiers of recovery points-in-time, is always online via our very sophisticated private network for speedy recovery, and most importantly, is incredibly economical for the value it provides. To really pour on the industry-speak acronym soup, it gives the customer the tools for their BCP to provide a DR scenario with the fastest RTO and the best RPO that any CAB would approve because of its obvious TCR (Total Cost of Recovery). OK, so I made that last one up… but if you don’t recover from data loss, what does it cost you?

On my personal server, I utilize this offering to protect more than 22 GB of data. It backs up my entire server daily, keeping no less than seven daily copies representing at least one week of data. It backs up my databases hourly, keeping no less than 72 hourly copies representing at least three days of data. It does all this seamlessly, in the background, and emails me when it is successful or if there is an issue.

Most importantly, it keeps my data safe in Seattle, while my server is located in Dallas. Alternatively, if my server were located in Seattle, I could choose for my data to be stored in Dallas or our new Washington DC facility. Here’s the kicker, though. It provides me the ability to have this level of protection, with all the bells and whistles mentioned above, without overstepping the boundary of my 10 GB service. That’s right, I have 72 copies of my database and 7 copies of my server, of which the original data totals in excess of 22 GB, stored within 10 GB on the backup server.

That’s more than sufficient for my needs, but I could retain weekly data or monthly data without significant increase in storage requirements, due to the nature of my dataset.

This service costs a mere $20/mo, or $240/yr. How much would you expect to pay to be able to sleep at night, knowing your data is safe?

Are you missing something? Wait … Backup!

-Justin

May 31, 2008

Response to On Site Development

On May 14th my buddy Shawn wrote On Site Development. Aside from the ambiguous title (I originally thought it was an article on web site development, rather than the more appropriate on-site development), there were a number of things that I felt could be expanded upon. I started by simply commenting on his post, but the comment hit half a page and I had to admit to myself that I was, in fact, writing an entire new post.

Updating the computer systems in these restaurants is a question of scale. Sure, it seems cheap to update the software on the 6 computers in a local fast food restaurant. However, a certain “largest fast-food chain in the world” has 31,000+ locations (according to Wikipedia). Now I know how much I would charge to update greasy fast-food computers, and if you multiply that by 31,000, you get a whole lot of dollars. It just doesn’t scale well enough to make it worthwhile. The bottom line is, the companies do cost-benefit analysis on all projects, and the cost of re-doing the messed up orders is apparently less than the cost of patching the software on a quarter million little cash registers and kitchen computers.

It's the same logic that led to Coke being sold for 5 cents for more than 60 years, spanning two world wars and the Great Depression without fluctuating in price. The vast majority of Coca-Cola during that time period was sold from vending machines. These vending machines only accepted nickels, and once a nickel was inserted, a Coke came out. That’s it. Nothing digital, no multi-coin receptacles, just insert nickel…receive Coke. The cost of replacing 100,000 vending machines was far higher than the profits they would get by increasing the price of Coke slightly. Only after World War II, when industrialization and the suburb were really taking off, did Coca-Cola start to phase out their existing vending machine line and replace it with machines capable of charging more than 5 cents per bottle.

Of course, we all know how coke machines operate now. Computerized bill changers, many of them hooked up to the internet, allow Coke to charge upwards of $3 for a 20oz beverage on a hot day at a theme park. Coke even attempted (in 2005) to fluctuate the price of Coke based on local weather conditions. People would want a Coke more on a hot summer day, so why not charge more for it? (Because the public backlash was severe to the point where boycotts were suggested the very same day Coke announced their new plan, but that’s another story.)

The fast food problem Shawn mentioned, as well as the vending machine problem, is why so many companies are moving onto the web. Online retail is exploding at a rate that can be described as a “barely controlled Bubble.” To tie back in with my comments on the fast food restaurant, this means that all your customers see the exact same website, written by the exact same piece of code. Want to change the way orders are displayed? Well simply alter the order display page, and every customer in every country from now on will see that new display format.

This doesn’t just apply to retail, however. Many companies are moving towards web-based internal pages. When I got my mortgage, the loan officer entered all my information into a web form on their intranet. This is brilliant, because it takes away all the cost of synchronizing the employee computers with the software, it removes the time needed for upgrades, and (most importantly) it means developers don’t have to come into the office at 4am to ensure that upgrades go smoothly before the start of the business day. So any of you business owners out there who have had to deal with the nightmare of upgrading antiquated POS software on dozens, hundreds, or hundreds of thousands of computers, consider making everything a web site.

SoftLayer has geographically diverse data centers, so your stores can always log in to a nearby server to cut down on latency, and we allow for VPN access, distributed databases, and real-time backups, making a web-based solution preferable to even the hard-coded local systems that many stores use now.

-Daniel

May 29, 2008

Plot Course to Vulcan, Warp Factor 8. Engage!

Resolutely pointing off into the starry void of space on the bridge of the Enterprise, klieg lights gleaming off his majestic dome, Captain Picard causes the Starship Enterprise to leap off on another mission. Once asked how the “warp drive” worked on Star Trek, Patrick Stewart replied, “I say Engage and we go.” Best explanation of warp drive I’ve ever heard.

I find I miss my Linux install. Due to circumstances beyond my control (i.e. I’m too lazy to stop being lazy), and the fact that few games work well on Linux without lots of under-the-hood tweaking, I broke down and bought a Windows installation for my PC. In between mining asteroids in my Retriever Mining Ship and solving 3D puzzles with a transdimensional gun, I do normal work with my computer: programming, web design, web browsing, video editing, file management, the whole deal.

Windows Vista, however, has a new feature that makes my work awesome. No, I’m not talking about the 3D accelerated desktop with semitransparent windows (although that IS awesome). I’m talking about the new Start Menu search box.

In Windows XP (I’m doing this right now), hitting the Windows key opens up the start menu. I can either use the mouse to navigate the menu (why use the start key if you’re going to mouse the menu?), or navigate with the keyboard arrows. However, this can be quite tedious and slow. If I remember the program’s “.EXE” name and the program is on the Windows System Path, I can select “Run…” and type in the name, like wmplayer for Windows Media Player. But the names are funky and again, the cool programs aren’t on the path.

In Windows Vista, however, when you bump the start menu, a new device, the SEARCH BOX, is automatically engaged in the start menu! So, when I want to use, say, Notepad, I type ‘windows key notepad enter’. Goldwave (sound recording) is ‘windows key goldwave enter’. When I want to use an Open Office tool, I bump the Windows key, type “open office” and then select the tool I want with the arrow keys, as the search box narrows down the huge Start Menu to just the entries that make sense. Even cooler: when it’s budget time, I hit the Windows key then type “budget”. Search brings up “Apartment Budget.ods”. Select that with the arrow keys, and it opens Open Office Calc (spreadsheet) for me.

It’s like having a command line in Windows. Any program is just a few keystrokes away, and for a Linux nut and a touch typist like me, that means my computer is that much more efficient. I don’t need muscle memory with the mouse to navigate the start menu, and I don’t have to squint at the menu items to find my program. I just have to remember the name!

Try it some time. It’s almost as awesome as saying “Engage” and going to Vulcan.

-Zoey

March 14, 2008

From the Outside Looking In

Recently, as you know, SoftLayer released the new API version 3. We have all been working very hard on it, and we've been completely immersed in it for weeks (months, for some of us). This means that, for the developers, we've been living and breathing API code for quite some time now. The time came to release the API, and as many of you know, it was a smashing success. However, we were lacking in examples for its use. Sure, we all had examples coming out our ears since the customer portal itself uses the API, but those were written by the same developers that developed the API itself, and therefore were still written from an insider's perspective.

So a call went out for examples. Many people jumped on the list, offering to write examples in a variety of languages. I thought I would tackle writing an API usage example in Perl. Perl, for those of you unfamiliar, is an infamous programming language. Flexible, confusing, fantastic and horrifying, it is the very embodiment of both "quick and dirty" and "elegance." It is well loved and well loathed in equal measure by the programming community. Nevertheless, I have some experience with Perl, and I decided to give it a try.

I will attempt to describe my thought process as I developed the small applications (which you should be able to locate shortly in the SLDN documentation wiki) throughout the work day.

9am: "Wow, I really don't remember as much Perl as I thought. This may be difficult."
10am: "I need to install SOAP::Lite, that shouldn't be hard."
11am: "Where the heck are they hiding SOAP::Lite? There are articles about it everywhere, but I can't actually find it or get it installed!"
12pm: "Ok, got SOAP::Lite installed, and my first test application works perfectly! Things are going to be ok! Wait…what's all this about authentication headers?"
1pm: "What have I done to deserve this? Why can't I pass my user information through to the API?"
2pm: "Aha! Another developer just wandered by and pointed out that I've been misspelling 'authentication' for 2 hours! Back on track, baby!" (Side note: another "feature" of Perl is how it never complains when you use variables that don't exist, it just assumes you never meant to type that. Of course, you could tell it to complain, but I forgot about that feature because I haven't used Perl in 4 years.)
3pm: I finally get example #1 working. It queries the API and shows a list of the hardware on your account.
3:30pm: Example #2 working; this shows the details for a single server, including datacenter and operating system.
4pm: Combining examples #1 and #2, the third example shows all hardware on your account, plus the installed OS and datacenter, in a handy grid right on the command line. Success! I put Perl away, hopefully for another 4 years.

The whole experience, though, really gave me an insight into how fantastically awesome the API is. I was looking at it from an outsider's perspective. I was confused as to how everything worked, I was working with an unfamiliar language, and I was browsing through the API looking for anything that looked "cool and/or useful." Getting a list of all my account's hardware to show up in a custom built application that I wrote as if I knew nothing about the API was a great feeling. It showed that not only was the API perfectly suited to the tasks we expected of it, but even a novice developer could, with a little effort, make an API application like mine. Expanding on it to show more and more information, and all the possibilities that it opened up in my mind made me realize how useful this API is that we made. It's not just something that a small percentage of our customers will be using. It's something that is truly revolutionary, and that all clients can take advantage of. I'm assuming, of course, that all clients have at least rudimentary skill in at least one programming language, but given the level of success everyone has had with our other offerings, I can assume that assumption is accurate.

If you have been thinking recently "look at all the noise they've been making about this 'API' nonsense," I highly recommend dusting off an old programming book and at least looking at it once. Think of all the possibilities, all the custom reports that you can make for yourself, all the data that we have provided right at your fingertips to assemble in any way you wish. We try our best to make the portal useful to every customer, but we know that you can't please all the people all the time. But with the API, we may do just that. If you're the kind of customer that is only interested in outbound bandwidth by domain, write an API script that displays just that! If you want to know the current number of connections and CPU temperature of your load balanced servers, get that data and show it! The possibilities are endless, and we're improving the API all the time.

-Daniel

March 10, 2008

Everyone is so Helpful Around Here!

Here at SoftLayer, all the developers stay on Jabber all day so we’re all accessible. As you know, we’ve recently been doing a major piece of software development: the new API. During development, we all were trying to get things finished as well as continue our day to day operations.

No fewer than 4 times during the last 3 weeks I have had the following conversation with a fellow developer.

Me: Hey, I need you to do something on this API class you wrote.
Other developer: Ok, no problem.
Me: Wait, you didn’t write this, did you?
Other developer: No, I didn’t.

Each time, when faced with a request to understand and modify something that they didn’t write, in addition to their already overwhelming workload, my fellow developers were more than happy to accept the new task.

It just goes to show what kind of environment we all work in here. Everyone is always willing to help, and it’s this attitude that allowed us to develop the API so quickly with such robust features. Each developer was willing to help the others, and that resulted in a tightly integrated product that we’re all very proud of.

The same sort of attitude pervades all of SoftLayer. I have had help on tasks from Networking, Accounting, and Sales since I got here, and each time everyone is more than happy to help out. The end result is, of course, that the customer gets their problems solved faster, and gets higher quality services out of the deal.

But really, fellow developers, if you’re reading this: it’s acceptable to say “I didn’t write that” when I ask you to change it. I won’t be offended. Half the problem is that we have 5 developers with names starting with J; I just clicked the wrong guy!

-Daniel
