Posts Tagged 'Efficiency'

April 17, 2012

High Performance Computing for Everyone

This guest blog was submitted by Sumit Gupta, senior director of NVIDIA's Tesla High Performance Computing business.

The demand for greater levels of computational performance remains insatiable in the high performance computing (HPC) and technical computing industries, as researchers, geophysicists, biochemists, and financial quants continue to seek out and solve the world's most challenging computational problems.

However, access to high-powered HPC systems has been a constant problem. Researchers must compete for supercomputing time at popular open labs like Oak Ridge National Laboratory in Tennessee. And small and medium-size businesses, even large companies, cannot afford to constantly build out larger computing infrastructures for their engineers.

Imagine the new discoveries that could happen if every researcher had access to an HPC system. Imagine how dramatically the quality and durability of products would improve if every engineer could simulate product designs 20, 50 or 100 times more.

This is where NVIDIA and SoftLayer come in. Together, we are bringing accessible and affordable HPC computing to a much broader universe of researchers, engineers and software developers from around the world.

GPUs: Accelerating Research

High-performance NVIDIA Tesla GPUs (graphics processing units) are quickly becoming the go-to solution for HPC users because of their ability to accelerate all types of commercial and scientific applications.

From Beijing to Silicon Valley — and just about everywhere in between — GPUs are enabling breakthroughs and discoveries in biology, chemistry, genomics, geophysics, data analytics, finance, and many other fields. They are also driving computationally intensive applications, like data mining and numerical analysis, to much higher levels of performance — as much as 100x faster.

The GPU's "secret sauce" is its unique ability to provide power-efficient HPC performance while working in conjunction with a system's CPU. With this "hybrid architecture" approach, each processor is free to do what it does best: GPUs accelerate the parallel research application work, while CPUs process the sequential work.

The result is an often dramatic increase in application performance.

SoftLayer: Affordable, On-demand HPC for the Masses

Now, we're coupling GPUs with easy, real-time access to computing resources that don't break the bank. SoftLayer has created exactly that with a new GPU-accelerated hosted HPC solution. The service uses the same technology that powers some of the world's fastest HPC systems, including dual-processor Intel E5-2600 (Sandy Bridge) based servers with one or two NVIDIA Tesla M2090 GPUs:

NVIDIA Tesla

SoftLayer also offers an on-demand, consumption-based billing model that allows users to access HPC resources when and how they need to. And, because SoftLayer is managing the systems, users can keep their own IT costs in check.

You can get more system details and pricing information here: SoftLayer HPC Servers

I'm thrilled that we are able to bring the value of hybrid HPC computing to larger numbers of users. And, I can't wait to see the amazing engineering and scientific advances they'll achieve.

-Sumit Gupta, NVIDIA - Tesla

January 3, 2012

Hosting Resolutions for the New Year

It's a new year, and though the only real change on January 1 is the last digit in the year, that change presents a blank canvas for the year. In the past, I haven't really made New Year's resolutions, but because some old Mayan calendar says this is my last chance, I thought I'd take advantage of it. In reality, being inspired to do anything that promotes positive change is great, so in the spirit of New Year's improvements, I thought I'd take a look at the resolutions hosting customers might want to make in 2012.

What in your work/hosting life would you like to change? It's easy to ignore or look past the small goals and improvements we can make on a daily basis, so let's take advantage of the "clean slate" 2012 provides and be intentional about making life easier. A few small changes can mean the difference between a great day in the office and a frantic overnight coffee binge (which we all know is so great for your health). Because these changes are relatively small, you might not recognize anything in particular that needs to change right off the bat. You might start with a daunting question like, "What should I do to improve my workflow or reduce work-related stress?" Luckily, any large goal like that can be broken down into smaller pieces that are much easier to manage.

Enough with the theoretical ... let's talk practical. In 2012, your hosting-related New Year's resolutions should revolve around innovation, conservation, security and redundancy.

Innovation

When it comes to hosting, a customer's experience and satisfaction is the most important focus of a successful business. There's an old cliche that says, "If you always do what you've always done, you'll always get what you've always gotten," and that's absolutely correct when it comes to building your business in the new year. What can you change or automate to make your business better? Are you intentionally "thinking outside the box?"

Conservation

The idea of "conservation" and "green hosting" has been written off as a marketing gimmick in the world of hosting, but there's something to be said for looking at your utilization from that perspective. We could talk about the environmental impact of hosting, and finding a host that is intentional about finding greener ways to do business, but if you're renting a server, you might feel a little disconnected from that process. When you're looking at your infrastructure in the New Year, determine whether your infrastructure is being used efficiently by your workload. Are there tools you can take advantage of to track your infrastructure's performance? Are you able to make changes quickly if/when you find inefficiencies?

Security

Another huge IT-related resolution you should make is about security. Keeping your systems locked down can be forgotten when you're pushing development changes or optimizing your network, so the beginning of the year is a great time to address any possible flaws in your security. Start with simple changes to your normal security practices:

  • Make sure your operating systems and software packages are regularly patched.
  • Keep a strict password policy that requires regular password updates.
  • Run system log checks regularly.
  • Reevaluate your system firewall rules and access control lists (ACLs).

All of these safety nets may be set up, but they may not be functioning at their best. Even precautions as simple as locking your client or workstation when it's not in use can help stop attacks from local risks and prying eyes ... And this practice is very important if you keep system backups on the same workstations that you use. Imagine if someone local to your workstation or client were able to retrieve your backup file and restore it ... Your security measures would be effectively nullified.

Redundancy

Speaking of backups, when was your most recent backup? When is your next backup? How long would it take you to restore your site and/or data if your current server(s) were to disappear from the face of the Earth? These questions are easy to shrug off when you don't need to answer them, but by the time you do need to answer them, it's already too late. Create a backup and disaster recovery plan. Today. And automate it so you can't forget to execute it.

Make your objectives clear, and set calendar reminders throughout the year to confirm that you're executing on your goals. If some of these tasks are daunting or difficult to implement in your current setup, don't get discouraged ... Set small goals and chip away at the bigger objective. Progress over time will speak for itself. Doing nothing won't get you anywhere.

Happy New Year!

-Jonathan

December 8, 2011

UNIX Sysadmin Boot Camp: bash - Keyboard Shortcuts

On the support team, we're jumping in and out of shells constantly. At any time during my work day, I'll see at least four instances of PuTTY in my task bar, so one thing I learned quickly was that efficiency and accuracy at the command line ultimately make life easier for our customers and for us as well. Spending too much time retyping paths, commands, vi navigation, and history cycling can really bring you to a crawl. So now that you've had some time to study bash and practice a little, I thought I'd share some of the keyboard shortcuts that help us work as effectively and as expediently as we do. I won't be able to cover all of the shortcuts, but these are the ones I use most:

Tab

[Tab] is one of the first keyboard shortcuts that most people learn, and it's ever-so-convenient. Let's say you just downloaded pckg54andahalf-5.2.17-v54-2-x86-686-Debian.tar.gz, but a quick listing of the directory shows you ALSO downloaded 5.1.11, 4.8.6 and 1.2.3 at some point in the past. What was that file name again? Fret not. You know you downloaded 5.2.something, so you just start with, say, pckg, and hit [Tab]. This autocompletes everything that it can match to a unique file name, so if there are no other files that start with "pckg," it will populate the whole file name (and this can occur at any point in a command).

In this case, we've got four different files that are similar:
pckg54andahalf-5.2.17-v54-2-x86-686-Debian.tar.gz
pckg54andahalf-5.1.11-v54-2-x86-686-Debian.tar.gz
pckg54andahalf-4.8.6-v54-2-x86-686-Debian.tar.gz
pckg54andahalf-1.2.3-v54-2-x86-686-Debian.tar.gz

So typing "pckg" and hitting [Tab] brings up:
pckg54andahalf-

NOW, knowing what files are there already, you could type "5.2" and hit [Tab] again to fill out the rest. However, if you didn't know what the potential matches were, you could double-tap [Tab]. This displays all file names matching that string.

Another fun fact: This trick also works in Windows. ;)
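
If you ever want to script the same matching logic, bash's compgen builtin lists completion candidates non-interactively. Here's a minimal sketch using the hypothetical file names from the example above (compgen is a bash builtin, so this assumes bash rather than another shell):

```shell
# Create the four hypothetical downloads in a scratch directory.
demo=$(mktemp -d)
cd "$demo"
touch pckg54andahalf-5.2.17-v54-2-x86-686-Debian.tar.gz \
      pckg54andahalf-5.1.11-v54-2-x86-686-Debian.tar.gz \
      pckg54andahalf-4.8.6-v54-2-x86-686-Debian.tar.gz \
      pckg54andahalf-1.2.3-v54-2-x86-686-Debian.tar.gz

# What a double-tap of [Tab] would list for "pckg": all four matches.
compgen -f -- pckg

# Typing more characters narrows the match, just like [Tab] does.
compgen -f -- pckg54andahalf-5.2
```

This is also how programmable completion scripts generate their candidate lists behind the scenes.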

CTRL+R

[CTRL+R] is a very underrated shortcut in my humble opinion. When you've been working in the shell for untold hours parsing logs, moving files and editing configs, your bash history can get pretty immense. Often you'll come across a situation where you want to reproduce a command or series of commands that were run regarding a specific file or circumstance. You could type "history" and pore through the commands line by line, but I propose something more efficient: a reverse search.

Example: I've just hopped on my system and discovered that my SVN server isn't doing what it's supposed to. I want to take a look at any SVN related commands that were executed from bash, so I can make sure there were no errors. I'd simply hit [CTRL+R], which would pull up the following prompt:

(reverse-i-search)`':

Typing "s" at this point would immediately return the most recent command containing the letter "s" ... Keep in mind that's not just commands starting with "s"; it's commands containing an "s" anywhere. Finishing that out to "svn" brings up the most recent command containing those letters in that order. Pressing [CTRL+R] again at this point will cycle back through the matching commands one by one.

In the search, I find the command that was run incorrectly ... There was a typo in it. I can edit the command within the search prompt before hitting enter and committing it to the command prompt. Pretty handy, right? This can quickly become one of your most used shortcuts.
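
When a full interactive reverse search is more than you need, the same idea works as a one-liner: filter the history list with grep. A quick sketch (the svn commands are hypothetical stand-ins, stashed with history -s because scripts keep no history by default):

```shell
# Enable history in this non-interactive shell, then seed a few entries.
set -o history
history -s 'svn update /var/www/project'
history -s 'tail -f /var/log/httpd/error_log'
history -s 'svn commit -m "fix typo"'

# Every history entry containing "svn", with its history number.
history | grep svn
```

In a real interactive session you'd skip the setup lines and just run the last command.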

CTRL+W & CTRL+Y

This pair of shortcuts is the one I find myself using the most. [CTRL+W] will basically take the word before your cursor and "cut" it, just like you would with [CTRL+X] in Windows if you highlighted a word. A "word" doesn't really describe what it cuts in bash, though ... It uses whitespace as a delimiter, so if you have an ultra long file path that you'll probably be using multiple times down the road, you can [CTRL+W] that sucker and keep it stowed away.

Example: I'm typing nano /etc/httpd/conf/httpd.conf (Related: The redundancy of this path always irked me just a little).
Before hitting [ENTER] I tap [CTRL+W], which chops that path right back out and stores it to memory. Because I want to run that command right now as well, I hit [CTRL+Y] to paste it back into the line. When I'm done with that and I'm out referencing other logs or doing work on other files and need to come back to it, I can simply type "nano " and hit [CTRL+Y] to go right back into that file.

CTRL+C

For the sake of covering most of my bases, I want to make sure that [CTRL+C] is covered. Not only is it useful, but it's absolutely essential for standard shell usage. This little shortcut performs the most invaluable act of killing whatever process you were running at that point. This can go for most anything, aside from the programs that have their own interfaces and kill commands (vi, nano, etc). If you start something, there's a pretty good chance you're going to want to stop it eventually.

I should be clear that this will terminate a process unless that process is otherwise instructed to trap [CTRL+C] and perform a different function. If you're compiling something or running a database command, generally you won't want to use this shortcut unless you know what you're doing. But, when it comes to everyday usage such as running a "top" and then quitting, it's essential.
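
Under the hood, [CTRL+C] sends the SIGINT signal, and a script can catch it with the trap builtin, which is exactly how programs override the shortcut. A minimal sketch:

```shell
# Register a handler for SIGINT (the signal CTRL+C sends).
cleanup_ran=0
trap 'cleanup_ran=1; echo "Caught SIGINT, cleaning up"' INT

# Send ourselves the same signal CTRL+C would deliver.
kill -INT $$

# The trap ran instead of killing the script, so we get here.
echo "still running, cleanup_ran=$cleanup_ran"
```

Run `trap - INT` when you want the default kill-the-process behavior back.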

Repeating a Command

There are four simple ways you can easily repeat a command with a keyboard shortcut, so I thought I'd run through them here before wrapping up:

  1. The [UP] arrow will display the previously executed command.
  2. [CTRL+P] will do the exact same thing as the [UP] arrow.
  3. Typing "!!" and hitting [Enter] will execute the previous command. Note that this actually runs it. The previous two options only display the command, giving you the option to hit [ENTER].
  4. Typing "!-1" will do the same thing as "!!", though I want to point out how it works: when you type "history", you see a numbered list of commands executed in the past. What "!-1" does is instruct the shell to execute (!) the command one position back in the history (-1). "!-2" reaches one further back, and "!n" runs the command numbered n in that list ... This can be useful for scripting.
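
History expansion is interactive by default, but you can watch it work in a script, too. A sketch, assuming bash (scripts need history recording and "!"-expansion switched on explicitly):

```shell
set -o history   # record commands in the history list
set -H           # enable "!"-style history expansion

log=$(mktemp)
echo "ran" >> "$log"
!!               # expands to the previous line and runs it again

wc -l < "$log"   # two lines if "!!" repeated the echo
```

This is purely a demonstration; in day-to-day use, "!!" earns its keep at an interactive prompt.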

Start Practicing

What it really comes down to is finding what works for you and what suits your work style. There are a number of other shortcuts that are definitely worthwhile to take a look at. There are plenty of cheat sheets on the internet available to print out while you're learning, and I'd highly recommend checking them out. Trust me on this: You'll never regret honing your mastery of bash shortcuts, particularly once you've seen the lightning speed at which you start flying through the command line. The tedium goes away, and the shell becomes a much more friendly, dare I say inviting, place to be.

Quick reference for these shortcuts:

  • [TAB] - Autocomplete to the furthest point in a unique matching file name or path.
  • [CTRL+R] - Reverse search through your bash history.
  • [CTRL+W] - Cut one "word" back, or until whitespace is encountered.
  • [CTRL+Y] - Paste a previously cut string.
  • [CTRL+P] - Display the previously run command.
  • [UP] - Display the previously run command.

-Ryan

June 28, 2011

Modern Website Design: Layout

There have been many books written about website design, and I am not about to take on the challenge of disputing any of them or trying to explain every facet of design. In this short blog, I want to explain what I have come to understand as the modern layout of websites. The term "layout" may have many different definitions, but for this article I am talking about the basic structure of your website, meaning separation of concerns, data transfer from host to client, how to handle changes in data, and when to change your page structure.

Separation of Concerns

It is important when sitting down for the first time to build a website to come up with an outline. Start by making a list of the parts of your website and the functions of those parts. I always start at the base of my web structure and work from there. HTML is always the foundation of a website; it defines the structure and outlines how you will display your data – plain and simple. It doesn't have to include data or styles, nor does it need to be dynamic ... At its essence, it's a static file that browsers can cache.

Client-side scripting languages like JavaScript will take care of client-side animations and data dispersal, while cascading style sheets (CSS) take care of style and presentation, and server-side scripting languages like PHP or Perl can take care of data retrieval and formatting.

Data Transfer

Where is your data going to come from, and what format will it be in when the client receives it? Try to use a data format that is the most compatible with your scripting languages. I use JavaScript as my primary client-side scripting language, so I try to use JSON as my data format, but that's not always possible when dealing with APIs and transferring data from remote computers. JSON is quickly becoming a standard data format, but XML* is still the most widely accepted format.

I prefer to use REST APIs as much as possible because they send the information directly to the client rather than using the server as a proxy. However, if a REST API is not available or if there is a security risk involved, you get the advantage of being able to format the data on the server before pushing it to the client. Try to parse and format data as little as possible on the client side of things; the client should be concerned with placing data.

Changes in Data

In the past, websites were made of multiple HTML documents, each one containing different data. The structure of the pages was the same, though, so the data changed but the code was nearly identical. Later, using server-side scripting languages, websites became more dynamic, displaying different data based on variables passed in the URL. Now, using AJAX or script injection, we can load new data into a static webpage without reloading it. This means less redundant code, less load on the client, and better overall performance.

Page Structure

It is important when displaying data to understand when to change the structure of the page. I start by creating a structure for my home page - it needs to be very open and unrestrictive so I can add pictures and text to build the site. Once that loose overall structure is established, I create a structure for displaying products (this one will be more restrictive, containing tables and ordering tools). The idea is to have as few HTML structures as possible, but if you find that your data doesn't fit or that you spend a lot of time positioning your data, then it might be time to create a new structure.

The Impact of a Modern Layout

Following these steps will lead to quicker, more efficient websites. This is (of course) not a new subject, and further understanding of web layout can be found in Model-View-Controller frameworks. If you find that you spend too much time writing code to interface with databases or place data, then frameworks are for you.

-Kevin

*If you have to deal with XML, make sure to include JavaScript libraries that make it easier to parse, like jQuery.

June 15, 2011

Relenta: Tech Partner Spotlight

We invite each of our featured SoftLayer Tech Marketplace Partners to contribute a guest post to the SoftLayer Blog, and this week, we're happy to welcome Dmitri Eroshenko from Relenta. In his guest post, Dmitri explains Relenta's inspiration and history to help you better understand how Relenta's online app can benefit your business.

Relenta

Company Website: http://www.relenta.com
Tech Partners Marketplace: http://www.softlayer.com/marketplace/relenta

Relenta: Get Things Done with One Click

We're all suffocating from information clutter. Our customer data and communications are scattered all over the place — multiple email accounts, social networks, CRMs and contact managers, instant messengers and chats, spreadsheets, various productivity and collaboration apps, calendars, and so on. We enter and re-enter data in different apps, which we endlessly cross-reference to reconcile discrepancies. We worry constantly that we're missing something.

At some point, we reach the threshold where pain becomes unbearable, stop and say, "There must be a better way!"

Our small software development team started working on Relenta six years ago with these very words. The idea was to take several apps our team used regularly — including email, of course — and distill them into one single program. Soon after we started building the program, we realized that by storing different types of customer records in the same backend database, we'd actually only begun the process of consolidating the information ... And that's where we started building Relenta's interface to truly streamline the process.

Instead of displaying various bits of customer information on separate screens, we created an interface that aggregated ALL data in one single activity stream. These "news feeds" provide at-a-glance views on the history of each of the customer relationships being tracked by the system. The feeds also put you in a one-click zone, from which no information is more than a single click away and no activity takes more than a single click to perform.

The rest is history. Today, Relenta is an elegant online application that lets you organize your entire customer-related life so that nothing is more than one click away.

The idea of building our platform around the one-click zone became our mantra and guiding principle. To put you into a one-click zone,
Relenta offers:

  • A unified inbox for all customer communications, including email and social network messages from LinkedIn, Facebook, and Twitter
  • A centralized platform for contact management, shared calendar, internal messaging, workflow management and document management
  • A built-in email marketing and email-autoresponder solution
  • A product philosophy that emphasizes disciplined process management and minimizes the number of steps it takes to get things done
  • A framework that enables asynchronous and geographically dispersed collaboration by keeping everyone and everything on the same page

As a result of this streamlined workflow, your data isn't fragmented or unnecessarily duplicated across your systems and you can be more efficient in your operations. By interlinking all communication activity between our team and each customer, we found ourselves getting twice as much work done in half the time.

If you find yourself bouncing between platforms to manage your customer relationships, Relenta might be a great fit for you. While I can talk about the value Relenta can provide and send you as many customer testimonials as you want to read, what matters is whether the app meets your needs. Check out our Live Demo and sign up for a Free Trial to put us to the test.

-Dmitri Eroshenko, Relenta

Join the one-click revolution at www.relenta.com!

January 19, 2011

AJAX Without XML HTTP Requests

What is AJAX?

Asynchronous JavaScript and XML - AJAX - is what you use to create truly dynamic websites. AJAX is the bridge between the application and presentation layers, facilitating lightning-fast application of data from the end user to the host and back to the end user. It dynamically changes the data displayed on the page without disrupting the end user or bogging down the client. Although the name is misleading, it's used as a term for any process that can change the content of a web page without unnecessarily reloading other parts of the page.

What are XML HTTP requests?

Passing information from your server to your end user's browser is handled over HTTP in the form of HTML. The browser then takes that info and formats it in a way the end user can view easily. What if we want to change some of the data in the HTML without loading a whole new HTML document? That's where XML comes in. Your web page needs to tell the browser to ask the server for the XML; luckily, nearly all browsers have a function called XmlHttpRequest(). Once it's called, it will poll the server for XML data.

Why shouldn't you use XML HTTP requests?

A long time ago, in a galaxy far, far away, Microsoft invented the XmlHttpRequest() object for Microsoft Exchange Server 2000. As with all first-generation technologies, everyone wanted to use it, and some people implemented it differently. IE didn't even have native support until 2006, and there are still discrepancies between browsers in the OnReadyStateChange event listener. There is also an issue with cross-domain requests. When the internet was young, JavaScript hackers would steal users' identities by pulling information from secure websites and posting it to their own, stealing bank account numbers, credit cards, etc. Now that the internet has grown up a bit, people with large networks and many servers have found uses for sending data across domains, but it's still not possible with XML HTTP requests.

What's an Alternative?

Using JavaScript, you can create client side scripts whose source is built with server side scripts, passing variables in the URL. Here's an example of a basic web page with local JavaScript, a few checkboxes for human interaction, and a table with some information that we want to change. View source on the page below to see the outline.


Looking at the three JavaScript functions: the first (clearTags) automatically clears the checkboxes on load; the second (check(box)) makes sure that only one box is checked at a time; the third (createScript) is the interesting one. It uses the createElement() function to create an external JavaScript element whose source is written in PHP. I have provided a sample script below to explain what I mean. First, we get the variable from the URL using the $_GET superglobal. Then, we process the variable with a switch, though you might use this opportunity to grab info from a database or another program. Finally, we print code that the browser will execute as JavaScript.

<?php
// First we get the variable from the URL
$foo = $_GET['foo'];
// Here's the switch to process the variable
switch ($foo) {
    case 'foo':
        print "var E=document.getElementById('data'); E.innerHTML='bar';";
        break;
    case 'fooo':
        print "var E=document.getElementById('data'); E.innerHTML='barr';";
        break;
    case 'ffoo':
        print "var E=document.getElementById('data'); E.innerHTML='baar';";
        break;
    case 'ffooo':
        print "var E=document.getElementById('data'); E.innerHTML='baarr';";
        break;
    default:
        print "var E=document.getElementById('data'); E.innerHTML='unknown';";
}
?>

-Kevin

November 29, 2010

Fun with Lists!

Back when I was doing research for my interview with SoftLayer, one of the things I looked for was financial data. Since SoftLayer isn't a public company, I couldn't get financial statements. However, I did find some nice round numbers in a press release that said the company does about $110 million in annual revenue. I thought, hey, that's not bad ... Then I kept reading, and when I saw that there were 170 employees, I was impressed. For those without a calculator handy, at those numbers SoftLayer does about $647,000 in revenue per employee per year.

Because people love lists, I looked up a few other companies' revenue per employee. These companies are in no particular industry and have nothing specific in common other than that they were the first to come to my mind.

Company - Revenue Per Employee
Exxon $3,235,638
Amazon $1,266,667
Google $1,180,832
Toyota $764,216
Microsoft $702,022
SoftLayer $647,059
Nike $563,663
Intel $523,133
AT&T $463,656
American Express $416,295
Dreamworks $335,052
Anheuser-Busch $315,172
New York Times $314,416
Oracle $283,048
IBM $238,541
Rackspace $232,512
Whole Foods $203,256
Walmart $198,410

Does anyone else think gas prices could be lower?

Note: The data for revenue and number of employees was either pulled from public press releases or from the Company Profile (# employees) and Key Statistics (Revenue) pages on Yahoo! Finance.

-Bradley

October 4, 2010

SoftLayer Fire Hose

Hi. My name is Mark Quigley, and I am a new SoftLayer employee. Specifically, I will be running the company's analyst relations program. This is my first week with the company, and the fire hose has not yet been turned off. In fact, I think this has been among the most intense weeks of my working life.

SoftLayer moves at a pace that I am not overly familiar with, given the time I have spent with some very large (and inevitably slow-moving) companies. It has been a pleasure to find myself in a group of 'quick-thinking doers' versus 'thinkers who spend too much time thinking and not enough time doing.' I have seen fewer PowerPoint decks and Excel spreadsheets this week than I thought was possible. It makes for a pleasant change, and change is a good thing. (My wardrobe has also undergone a SoftLayer transformation. It now features black shirts and some more black shirts.)

The week began with the announcement that SoftLayer had launched its second Dallas data center. The data center (DAL05) has capacity for 15,000 servers, delivers 24x7 onsite support, and has multiple security protocols controlling entrance to the facility. The diesel generators that sit outside are massive – think of a locomotive on steroids. DAL05 is fully connected to SoftLayer's data centers at the INFOMART in Dallas, in Seattle, Washington, and in the Washington D.C. area in addition to the company’s network Points of Presence in seven additional U.S. cities.

The reason for the expansion is simple: SoftLayer continues to grow. In fact, our new office location appears destined to be more a home for large generators and server racks than for people (though there are more of those to come, too). Current plans call for two more pods at DAL05 to come alive over the next 18 - 24 months. In addition, a facility in San Jose is expected to go live early in 2011, and we are in the midst of international expansion plans. There is a lot going on around here.

I think it is interesting to step back for a second and look at what is driving this growth. The fact that SoftLayer is ruthlessly efficient, allowing customers to get from 0 to 60 faster than anyone else, is certainly one reason. So are the fantastic support processes that are in place. The guys around here are very good at what they do. That being said, this is a time when a rising tide is raising all ships. And that is a good thing. We want to beat our competition every time we see them across the table, but we are glad they are enjoying their share of success, because it means the marketplace is booming. Even better, it is showing no sign of letting up.

The changes that we have witnessed in the past fifteen years are nothing short of staggering. I remember sending faxes to clients as the primary means of document exchange and then being thrilled at the notion of a single dial-up AOL account being shared by five people in the office. Now I have access to the internet via at least two devices in the office and one when I am not. At home I surf the net and watch content streamed from Netflix on my iPad. My son plays the PS3 online with his pals, my daughter spends time watching Dora the Explorer on the Nick Jr. website, and my wife has rekindled countless friendships with high school friends she has not seen in decades via Facebook. I don't think I am unusual in my habits, either. None of this was happening ten years ago.

The most recent wave has come with the arrival of social networking sites (a term that had a much different definition when I was young!) and their associated applications. Companies like Twitter and Facebook have driven a terrific amount of innovation, and they continue to do so. So too have companies like Apple: music downloads and application downloads are now in the billions. The net result has been a terrific amount of business for companies like SoftLayer. I mean, who ever thought that online farming would drive as much interest, traffic and money as it has? And the really cool part of all of this is that the world my kids will occupy in ten years is going to be richer than mine by at least an order of magnitude. SoftLayer will be there to make it all work. It is going to be a fun ride.

-@quigleymar

October 9, 2009

Facebook games, the datacenter, and you – film at 11

Ok, I admit it. I am addicted to Facebook games. Those of you who are a bit “long in the tooth” might remember a series of games from a certain era where all you did was walk around and try to figure “it” out, but you really didn’t know what “it” was. Zork, for instance, was my favorite. In Zork you simply walked around and talked to people, touched walls while things rumbled, picked up and dropped items, etc. Now don’t misunderstand, you didn’t see any of this happen; it was all in your head, because the only thing on the screen was text. Think of it like the hit TV show LOST in text form, with you as John Locke. Are you LOST yet?

Facebook has taken us back to the world of Zork, but now you can almost see what is going on. Let’s use the early Mobster-style games as example number one. They were sleek and simple: do a job, fight someone, whack someone on the hit list, write a script, find a bot to do it all for you, and become a “made man”. Now, the main idea in these games is ad generation and page views, so when the techies of the world figured out how to cheat, um, I mean make the game more efficient, it was time to add some new ideas to the games to keep you tuned in to your monitor and the ads on the page instead of your bot! Enter the Flash games; they are shiny, and I like shiny things! Maybe the word should be polished. There are a few farm simulation games that are very popular; a couple of them have over 18 million monthly active users. Who would have thought that everyone in the world wanted to move to Texas and farm veggies or berries, or raise animals and fruit trees? I have to say that the new games are to carpal tunnel as Krispy Kreme is to clogged arteries. You have to click, and then click a little more, and then even a little more. You have to do tasks so you can do jobs, so you can move up in levels, so you can do more tasks to do even more jobs to make more money, and it just keeps getting more involved. Maybe there is a Flash automation system out there I can find to do it for me!

I am going back to the farm idea for a minute. When I started out, I had a couple of small plots and I would plant different crops. I had a few animals walking around, a fruit tree or two, some fences, some green space in between, and flowers. I began to notice that some of the extra shiny things got in the way and made my farm very inefficient. So I began to streamline: one crop, no green space because that is just wasted, no animals; just plant the whole screen, harvest and plow, rinse and repeat. It is now very profitable and easy to manage, and I don’t have to worry that this crop will be ready in 2 hours while that crop will be ready in 2 days. It just works!

So I have just described SoftLayer to you in a nutshell. At first we tried many things, then streamlined them, got them down to a very efficient science, automated “it”, and then wrapped products around “it”. Our products are shiny, we don’t waste space, we have one crop, and it just works!

June 8, 2009

Does College Really Prepare You for the Real World?

As I enter my final semester of college, SoftLayer has given me the opportunity to experience what it's like to have a "real job." I am very lucky to have the chance to work for a great company and gain valuable work experience before I graduate. Although I have only been here a little over a week, it is very exciting to be part of a hardworking team and an innovative company. Everybody in the office is a strong believer in SoftLayer, and that is why they are here.

The question at hand is: does college prepare you for the real world? The obvious answer should be yes. We spend four or more years of our lives at universities and colleges, and most of us are still in debt for it. I sometimes wondered how Aristotle or The Canterbury Tales had any application to my future career. Although many of the courses we studied outside our majors seemed irrelevant, I see now that we did learn something from them. We learned how to meet deadlines and work diligently. College is strenuous for a reason, and now that I have been part of the work force, I understand this. Being able to complete college coursework proves to employers that you have the ability to learn and to take on large tasks.

There are many aspects of college that have definitely prepared me for this job. The most important skill I have gained from college so far has been working with Excel; as a market analyst, I spend most of my days in Excel spreadsheets. College has also helped me gain a sense of independence and responsibility, two very important attributes for an efficient employee. Your boss needs to trust you not only to get tasks done, but to get them done well, and professors do not hold you to any lesser standard. During college, there are also many essential lessons learned outside the classroom, like dealing with roommates, getting along with a diverse group of people, paying bills on time, and being punctual.

In conclusion, college does prepare us for the real world. Sometimes I would sit in class and say to myself, “I’m never going to use this,” and I am sure I was not the only one. The most important thing I took from college is to work hard. Sometimes your boss will ask you to do things that you do not want to do, but that is life. Life takes hard work, and hard work will let you experience the best things in life that you value.
