How Clouds Killed The PC

August 3, 2010

Most days, it seems that technology progresses all too slowly. It is a different feeling when you work with cutting-edge technology on a daily basis: deploying the first dual-network datacenter infrastructure, being entrenched in solutions for everything from CDN to iSCSI to DTS and more, testing the latest enterprise solutions from leading industry vendors long before money could buy them… it never really meant a whole lot to me; it was very much just, “How we roll,” as the gang would say.

But every so often, there is a day when a new technology catches my attention and reminds me why I got involved in the IT industry. Something that reminds me of the days spent tapping out QuickBasic 2.0 applications on my 18MHz 386 and 16-color EGA monitor. Surprisingly, the rise of cloud computing did just that.

There was a day, which some still remember, when the cost of localized hardware was significant enough that terminals ruled the world. Occasionally, you may still see one at a grocery checkout stand or being used in a retail stockroom to check inventory across locations. Early terminals were commonly thin clients lacking a processor and non-volatile user storage, possessing only enough memory to display what was on the screen at any given time. As the cost of memory declined, fat clients, which offered locally programmable memory, gained some popularity. However, the concept was still the same: one host machine, usually a mainframe, serving applications over a distance to multiple (less capable) client machines.

Terminals were not destined to last, though. In a twist of irony, one of the innovations they helped to inspire, the microprocessor, combined with the falling price and increasing capacity of memory, eventually led to the decline of terminals. Left behind, in a cloud of dust, by hardware manufacturers’ race for speed and capacity, combined with advances in networking technology, the terminal became a historical relic, looked upon as a necessary stop-gap solution used in the days when hardware was just too darn expensive. It was at that time that the truly personal computer we know and love was born, and it has reigned supreme ever since. Then came the ARPANET, which gave way to the Information Superhighway, which gave way to the World Wide Web, which gave way to the internet we know today.

Mainframes gave way to servers. And today, I walk into a datacenter surrounded by servers boasting quad octo-core processors and Cloud Computing Instances, talk to customers who use their smartphones to remotely access their web hosts, and quietly think to myself, “Have things really changed?” How far off is the day when the benefits of remotely hosted applications outweigh those of localized hardware? When we sit at the start of a new era where CCIs can be created in minutes, regularly imaged for data security, migrated and restored quickly in the event of hardware failure, and accessed from anywhere on a variety of client hardware and software, how much more would it take for us to return to the days of the terminal? As bandwidth continues to improve, purchase and operational costs per processing core continue to fall, people demand more and more ‘anywhere access’, open source gains popularity, and the idea of renting freely upgraded applications becomes accepted outside of the IT community, who knows what the future might hold. In a future where the concept of parallel uplinks may be no more foreign than that of parallel data transfer over CAT6 is to the layman, I wonder if personal computers will be thought of as the necessary stop-gap solution used while we waited for bandwidth to catch up to usable processing power: nothing more than a dinosaur that gave way to the green movement and our need to be connected everywhere.
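
Just to put “created in minutes” in perspective, here is a minimal sketch of what spinning up and imaging such an instance can look like through a provisioning API. The endpoint, field names, and the https://api.example-cloud.com URL are hypothetical placeholders rather than any particular vendor’s interface; it illustrates the idea, not a working client.

    # Minimal sketch: provisioning and imaging a cloud computing instance
    # through a hypothetical REST API. Endpoint and field names are
    # placeholders, not any specific provider's interface.
    import json
    import urllib.request

    API = "https://api.example-cloud.com/v1"   # hypothetical endpoint
    TOKEN = "REPLACE_WITH_API_TOKEN"           # hypothetical credential

    def call(method, path, payload=None):
        """Send an authenticated JSON request and return the decoded response."""
        data = json.dumps(payload).encode() if payload is not None else None
        req = urllib.request.Request(
            API + path,
            data=data,
            method=method,
            headers={"Authorization": "Bearer " + TOKEN,
                     "Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    # Create an instance: a short description instead of a hardware purchase order.
    instance = call("POST", "/instances", {
        "cores": 4,
        "memory_gb": 8,
        "image": "ubuntu-10.04",
        "datacenter": "dal05",
    })

    # Image it for data security; restoring onto new hardware after a failure
    # would be another call referencing the snapshot id.
    snapshot = call("POST", "/instances/%s/snapshots" % instance["id"],
                    {"note": "nightly image"})

    print("instance:", instance["id"], "snapshot:", snapshot["id"])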

While I work on bringing my head out of the clouds, I remember why I am here. I am not here because technology’s past was all that fantastic, or because the present is all that glamorous, but because the future is still wide open. Whether or not clouds ever really kill the PC is anyone’s guess, and only time will tell. However, one thing is currently known: as companies continue to see the benefit of having their staff conduct business through a web-portal interface, consumers continue trying to figure out what to do with the extra two or three of the four cores they have, and the cost-to-performance ratio associated with remote resources continues to fall, we are steadily moving that way.
