Posts Tagged 'Guest Blog'

May 8, 2014

SoftLayer Security: Questions and Answers

When I talk to IBM Business Partners about SoftLayer, one of the most important topics of discussion is security. We ask businesses to trust SoftLayer with their business-critical data, so it’s important that SoftLayer’s physical and network security is as transparent and understandable as possible.

After going through the notes I’ve taken in many of these client meetings, I pulled out the ten most frequently asked questions about security, and I’ve compiled answers.

Q1: How is SoftLayer secured? What security measures does SoftLayer have in place to ensure my workloads are safe?

A: This “big picture” question is the most common security-related question I’ve heard. SoftLayer’s approach to security involves several distinct layers, so it’s tough to generalize every aspect in a single response. Here are some of the highlights:

  • SoftLayer's security management is aligned with U.S. government standards based on the NIST 800-53 framework, a catalog of security and privacy controls defined for U.S. federal government information systems. SoftLayer maintains SOC 2 Type II reporting compliance for every data center. SOC 2 reports are audits against controls covering security, availability, and processing integrity. SoftLayer's data centers are also monitored 24x7 for both network and on-site security.
  • Security is maintained through automation (which reduces the opportunity for human error) and audit controls. Server room access is limited to authorized employees only, and every location is protected against physical intrusion.
  • Customers can create a multi-layer security architecture to suit their needs. SoftLayer offers several on-demand server and network security devices, such as firewalls and gateway appliances.
  • SoftLayer integrates three distinct network topologies for each physical or virtual server and offers security solutions for systems, applications, and data as well. Each customer has one or many VLANs in each data center facility, and only users and servers the customer authorizes can access servers in those VLANs.
  • SoftLayer offers single-tenant resources, so customers have complete control and transparency into their servers.

Q2: Does SoftLayer destroy my data when I’ve de-provisioned a compute resource?

A: Yes. When a customer cancels any physical or virtual server, all data is erased using the Department of Defense (DoD) 5220.22-M standard.

Q3: How does SoftLayer protect my servers against distributed denial of service (DDoS) attacks?

A: A SoftLayer Network Operations Center (NOC) team monitors network performance and security 24x7. Automated DDoS mitigation controls are in place should a DDoS attack occur.

It's important to clarify here that the primary objective of this DDoS mitigation is to maintain performance integrity of the overall cloud infrastructure. With that in mind, SoftLayer can't stop a customer from being attacked, but it can shield the customer (and any other customers in the same network) from the effects of the attack. If necessary, SoftLayer will remove the target from the public network for periods of time and null-route incoming connections. Because of SoftLayer's three-tiered network architecture, a customer would still have access to the targeted system via the private network.

Q4: How is communication segmented from other tenants using SoftLayer?

A: SoftLayer uses industry-standard VLANs and switch access control lists (ACLs) to segment customer environments. Customers have the ability to add and manage their own VLANs, providing additional security even inside their own accounts. ACLs are configured to permit or deny specified network packets (data) as they are directed through a switch.

Q5: How is my data kept private? How can I confirm that SoftLayer can’t read my confidential data?

A: This question is common among customers who deal with sensitive workloads such as HIPAA-protected documentation, employee records, case files, and so on.

SoftLayer customers are encouraged to deploy a gateway device (e.g. Vyatta appliance) on which they can configure encryption protocols. Because the gateway device is the first hop into SoftLayer’s network, it provides an encrypted tunnel to traverse the VLANs that reside on SoftLayer. When securing compute and storage resources, customers can deploy single tenant dedicated storage devices to establish isolated workloads, and they can even encrypt their hard drives from the OS level to protect data at rest. Encrypting the hard drive helps safeguard data even if SoftLayer were to replace a drive or something similar.
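
As a rough illustration of that last point, here is a minimal sketch of OS-level encryption for a secondary data volume on a Linux server using LUKS (dm-crypt). The device name /dev/xvdc and the /data mount point are placeholders; adapt them to your own environment and keep the passphrase or key file under your own control.

  cryptsetup luksFormat /dev/xvdc             # initialize LUKS encryption on the volume; prompts for a passphrase
  cryptsetup luksOpen /dev/xvdc secure_data   # unlock the volume and map it to /dev/mapper/secure_data
  mkfs.ext4 /dev/mapper/secure_data           # create a filesystem on the encrypted device
  mkdir -p /data && mount /dev/mapper/secure_data /data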

Q6: Does SoftLayer track and log customer environments?

A: Yes. SoftLayer audits and tracks all user activity in our customer portal. Some examples of what is tracked include:

  • User access, both failed and authenticated attempts (destination IP is shown on a report)
  • Compute resources users deploy or cancel
  • API activity for each call (who called the API, which call and function were used, etc.)
  • Intrusion Protection and Detection services that observe traffic to customer hosts
  • Additionally, customers have root access to the operating systems on their servers, so they can implement additional logging of their own (see the sketch below).
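
As an example of that last point, a customer with root access could layer in host-level audit logging with auditd, which ships with CentOS/RHEL. This is only a sketch; the watched files and key names are illustrative.

  auditctl -w /etc/passwd -p wa -k account_changes    # record writes and attribute changes to /etc/passwd
  auditctl -w /etc/sudoers -p wa -k sudoers_changes   # record changes to the sudoers file
  ausearch -k account_changes                         # review the matching audit events later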

Q7: Can I disable access to some of my users through the customer portal?

A: Yes. SoftLayer has very granular ACLs. User entitlements are segmented into different categories, including Support, Security, and Hardware. SoftLayer also gives customers the ability to limit access to the public and private networks. Customers can even limit user access to specific bare metal or virtual servers.

Q8: Does SoftLayer patch my operating system?

A: For unmanaged cloud servers, no. Once the operating system is deployed on a customer's server, SoftLayer doesn't touch it.

If you want help with that hands-on server administration, SoftLayer offers managed hosting. In a managed hosting environment, Technical Account Managers (TAMs) are assigned as focal points for customer requests and issues. TAMs help with reports and trending data that provide recommendations to mitigate potential issues (including OS patching).

Q9: Is SoftLayer suited to run HIPAA workloads?

A: Yes. SoftLayer has a number of customers running HIPAA workloads on both bare metal and single-tenant virtual servers. A Business Associate Agreement (BAA), signed by SoftLayer and the customer, clearly defines the shared responsibilities for data security: SoftLayer is solely responsible for the security of the physical data center, along with the SoftLayer-provided infrastructure.

Q10: Can SoftLayer run government workloads? Does SoftLayer use the FISMA standards?

A: The Federal Information Security Management Act (FISMA) defines a framework for managing information security that must be followed for all federal information systems. Some state institutions don't require FISMA but look to cloud hosting companies to be aligned with the FISMA guidelines.

Today, two SoftLayer data centers are audited to the FISMA standards: Dallas (DAL05) and Washington, D.C. (WDC01). Customers looking for the FISMA standard can deploy their workloads in those data centers. Future plans include data centers that comply with the more stringent FedRAMP requirements.

For additional information, I highly recommend the on-demand SoftLayer Fundamentals session, "Keep safe – securing your SoftLayer virtual instance." Also, check out Allan Tate's Thoughts on Cloud blog, "HIPAA and cloud computing: What you need to know," for more on how SoftLayer handles HIPAA-related workloads.

-Darrel Haswell

Darrel Haswell is a Worldwide Channel Solutions Architect for SoftLayer, an IBM Company.

April 23, 2014

Security: 10 Tips for Hardening a Linux Server

In light of all the complex and specialized attacks on Internet-facing servers, it's very important to protect your cloud assets from malicious assailants whose sole purpose is to leech, alter, expose, or siphon your sensitive data, or even to shut you down. As someone who does a lot of Linux deployments, I like to keep a Linux template handy with some extra security policies configured.

Securing your environment starts during the ordering process when you are deploying server resources. Sometimes you want to deploy a quick server without putting it behind an extra hardware firewall layer or deploying it with APF (Advanced Policy Firewall). Here are some security hardening tips I have set on my Linux template to establish a solid base level of security when I deploy a Linux system.

Note: The following instructions assume that you are using CentOS or Red Hat Enterprise Linux.

1. Change the Root Password
Log in to your server and change the root password if you didn't use an SSH key to gain access to your Linux system.

  • passwd - Make sure the new root password is strong.
  • Don't plan on using the root account for day-to-day work.

2. Create a New User
The root user is the only user created on a new Linux install. You should add a new user for your own access and use of the server.

  • useradd <username>
  • passwd <username> (Make sure this is a strong password that’s different from your root password.)

3. Change the Password Age Requirements
Change the password age so you'll be forced to change your password after a given period of time (you can verify the result as shown in the sketch below):

  • chage -M 60 -m 7 -W 7 <username>
    • M: Maximum number of days the password remains valid
    • m: Minimum number of days required between password changes
    • W: Number of days of warning before the password expires
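
To confirm the new policy took effect, list the user's current aging settings (a quick sanity check; <username> is the account you created above):

  chage -l <username>   # shows the minimum/maximum password age, warning period, and expiration dates now in effect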

4. Disable Root Login
As Lee suggested in the last blog, you should Stop Using Root!

  • When you need super-user permissions, use sudo instead of su. Sudo is more secure than su: when a user executes root-level commands through sudo, each command is logged by default in /var/log/secure. Furthermore, users have to authenticate themselves to run sudo commands, and that authentication is only cached for a short period of time. A minimal setup is sketched below.
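
Here is a minimal sketch of that setup on CentOS/RHEL. It grants the user you created in tip 2 sudo rights through the wheel group; the username is a placeholder, and locking the root password is optional and should only be done after you've confirmed sudo works.

  usermod -aG wheel <username>   # add your user to the wheel group
  visudo                         # then uncomment the line:  %wheel  ALL=(ALL)  ALL
  passwd -l root                 # optional: lock the root password once sudo is confirmed working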

5. Use Secure Shell (SSH)
The rlogin and telnet protocols don't use an encrypted format; everything is sent as plain text. I recommend using the SSH protocol for remote logins and file transfers. SSH allows you to use encryption technology while communicating with your server. SSH is still open to many different types of attacks, though, so I suggest using the following to lock SSH down a little bit more (a sample sshd_config excerpt follows the list):

  • Remove the ability to SSH in as root:
    1. Run vi /etc/ssh/sshd_config
    2. Find #PermitRootLogin yes and change it to PermitRootLogin no.
    3. Run service sshd restart.
  • Change the default SSH port (22). You can even use RSA keys instead of passwords for extra protection.
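
Putting those recommendations together, the relevant lines in /etc/ssh/sshd_config might look something like the excerpt below. The port number is arbitrary (pick your own), and only disable password authentication after your key is installed and tested; restart sshd afterward as in step 3.

  # /etc/ssh/sshd_config (excerpt)
  Port 2222                    # any non-default port you choose
  Protocol 2                   # allow only the SSH-2 protocol
  PermitRootLogin no           # no direct root logins over SSH
  PasswordAuthentication no    # require key-based logins instead of passwords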

6. Update Kernel and Software
Ensure your kernel and software patches are up to date. I like to make sure my Linux kernel and software are always up to date because patches are constantly being released to correct security flaws and exploits. Remember that you have access to SoftLayer's private network for updates and patches, so you don't have to expose your server to the public network to get them. On Red Hat or CentOS, run yum update with sudo to pull in the latest updates.
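
For example, on a CentOS/RHEL 6-era system (and assuming the yum-cron package is available in your repositories if you want unattended updates):

  sudo yum update -y              # apply all available package and kernel updates
  sudo yum install -y yum-cron    # optional: nightly unattended updates
  sudo chkconfig yum-cron on      # enable it at boot (use systemctl on newer releases)
  sudo service yum-cron start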

7. Strip Your System
Clean your system of unwanted packages. I strip my system of unnecessary software to avoid vulnerabilities. This is called "reducing the attack surface." Packages like NFS, Samba, and even the X Window desktops (e.g., GNOME or KDE) contain vulnerabilities. Here's how to reduce the attack surface (a short usage example follows the list):

  • List what is installed: yum list installed
  • List the package name: yum list <package-name>
  • Remove the package: yum remove <package-name>
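
As a quick usage example (the package and service names here are only illustrations; remove or disable only what you've confirmed you don't need):

  yum list installed | grep -i samba   # check whether Samba packages are present
  yum remove samba                     # remove them if nothing you use depends on them
  chkconfig --list | grep ':on'        # list services that start at boot
  chkconfig nfs off                    # keep an unneeded service from starting automatically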

8. Use Security Extensions
Use a security extension such as SELinux on RHEL or CentOS when you’re able. SELinux provides a flexible Mandatory Access Control (MAC); running a MAC kernel protects the system from malicious or flawed applications that can damage or destroy the system. You’ll have to explore the official Red Hat documentation, which explains SELinux configuration. To check if SELinux is running, run sestatus.
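
A minimal sketch of checking and enforcing SELinux on CentOS/RHEL:

  sestatus        # shows whether SELinux is enabled and which policy is loaded
  getenforce      # prints Enforcing, Permissive, or Disabled
  setenforce 1    # switch to enforcing mode for the current boot
  # to make the change permanent, set SELINUX=enforcing in /etc/selinux/config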

9. Add a Welcome/Warning
Add a welcome or warning display for when users remote into your system. The message can be created using MOTD (message of the day). MOTD’s sole purpose is to display messages on console or SSH session logins. I like for my MOTDs to read “Welcome to <hostname>. All connections are being monitored and recorded.”

  • I recommend editing the file with vi /etc/motd (a one-line example follows).
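
For example (substitute your own wording; $(hostname) expands to the server's hostname):

  echo "Welcome to $(hostname). All connections are being monitored and recorded." > /etc/motd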

10. Monitor Your Logs
Monitor logs whenever you can. Some example logs that you can audit:

  • System boot log: /var/log/boot.log
  • Authentication log: /var/log/secure
  • Login records: /var/log/utmp and /var/log/wtmp
  • General system messages and current activity: /var/log/messages
  • Authentication logs: /var/log/auth.log
  • Kernel logs: /var/log/kern.log
  • Crond logs (cron job): /var/log/cron.log
  • Mail server logs: /var/log/maillog

You can even move these logs to a bare metal server to prevent intruders from easily modifying them.
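
One way to do that is with rsyslog, which ships with CentOS/RHEL. The sketch below forwards everything to a remote collector; logs.example.com and port 514 are placeholders.

  # on each server being hardened, forward all logs to the collector (@@ = TCP, @ = UDP):
  echo '*.* @@logs.example.com:514' >> /etc/rsyslog.conf
  service rsyslog restart
  # on the collecting server, enable TCP reception in /etc/rsyslog.conf:
  #   $ModLoad imtcp
  #   $InputTCPServerRun 514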

This is just the tip of the iceberg when it comes to securing your Linux server. While this won't give you the most secure system possible, it buys you breathing room when you have to deploy quick servers for short-duration tests and the like. You can build more security into servers that are meant to stick around longer.

- Darrel Haswell

Darrel Haswell is an advisory SoftLayer Business Partner Solution Architect.

December 16, 2013

Xplenty: Tech Partner Spotlight

We invite each of our featured SoftLayer Tech Marketplace Partners to contribute a guest post to the SoftLayer Blog, and this week, we're happy to welcome Yaniv Mor from Xplenty. Xplenty is a cloud-based, code-free Hadoop-as-a-Service platform that allows you to easily create data workflows and provision, monitor, and scale clusters. Their goal is to eliminate the complexity of Hadoop to make it accessible and cost-effective for everyone.

Simplifying Hadoop

Apache Hadoop, open source software developed by Doug Cutting, is the most popular storage and processing platform for big data. Because Hadoop can accommodate structured data, semi-structured data, and unstructured data, it is the storage architecture of choice for some of the Internet's largest and most data-rich sites. Industry giants such as Google and Facebook have been using Hadoop for years to store and deliver information while gathering insights from customer behavior and internal business processes, and their obvious success with the platform has helped drive broad adoption and popularity all the way down to small-businesses and startups.

Specific use cases vary among industries, but similarities exist. Many companies leverage Hadoop to gather information about their clientele. With Hadoop, a company can process huge amounts of data to examine past and present behaviors, and with that information, customers can be presented personally-tailored recommendations, and the business can glean deep insights from the trends and outliers in its customer base. As a result, customers are more likely to make repeat purchases, and companies are able to predict trends and possible risks, allowing them to visualize and prepare for a number of business scenarios.

Another compelling use case for Hadoop is its ability to analyze and report on multi-faceted marketing and advertising campaigns. By drilling down into the guts of a campaign, users can see exactly what worked and what didn't. Marketers and advertisers can direct their resources to the campaigns that worked and let the ineffective ones fall by the wayside.

On the internal side, businesses are using Hadoop to better understand their own information. Data systems at financial companies use it to detect fraud anomalies by comparing transaction details. If you've ever made a credit card purchase in another state or country but the purchase didn't go through, your bank's system probably flagged the transaction for a representative to investigate. Other companies analyze data collected from their networks to monitor activity and diagnose bottlenecks and other issues with a negative impact.

The challenge with leveraging Hadoop's broad potential is that a company generally needs dedicated technical resources to allocate toward building and maintaining the solution — from manpower to financial to infrastructure. Hadoop is difficult to program and requires a very specific skill set that few possess. If a company doesn't have the personnel for the job, it will need to fork over some serious cash to get a system built and maintained. This can significantly hinder the progress of the data and business intelligence teams, and by default, the progress of the company. That's why we decided to create Xplenty.

Xplenty is a coding-free Hadoop-as-a-Service platform that allows data and BI users to process their big data stored on the SoftLayer cloud without having to acquire any special skills. What Xplenty does is remove the need to divert those precious resources away from the business at hand. Xplenty's Hadoop-as-a-Service platform has a graphical user interface that enables the data and BI teams to build data flows without ever having to write a line of code. The benefit of this is twofold. First, the business intelligence analysts can quickly build data flows that would typically take weeks or more to program and debug, and data users can easily insert Xplenty into their data stack to handle processing needs. The second benefit is that since the IT department doesn't have to worry about doing any programming, it is able to tackle more pressing issues, bottlenecks are avoided, and life goes on without a hitch.

Xplenty was created specifically for the cloud, and SoftLayer is a major player in this space, so it was a natural fit for us to partner up to provide a SoftLayer-specific offering that will perform even better for customers already using SoftLayer infrastructure. We only work with providers with the best and most stable infrastructure, and SoftLayer is definitely at the top of the list.

If you want to try Hadoop on Xplenty, jump over to our SoftLayer sign up page, enter your details, and test drive the platform with a free 30-day trial!

- Yaniv Mor, Xplenty

This guest blog series highlights companies in SoftLayer's Technology Partners Marketplace.
These Partners have built their businesses on the SoftLayer Platform, and we're excited for them to tell their stories. New Partners will be added to the Marketplace each month, so stay tuned for many more to come.
August 1, 2013

The "Unified Field Theory" of Storage

This guest blog was contributed by William Rocca of OS NEXUS. OS NEXUS makes the QuantaStor Software Defined Storage platform, designed to tackle the storage challenges facing cloud computing, big data, and high-performance applications.

Over the last decade, the creation and popularization of SAN/NAS systems simplified the management of storage into a single appliance so businesses could efficiently share, secure and manage data centrally. Fast forward about 10 years in storage innovation, and we're now rapidly changing from a world of proprietary hardware sold by big-iron vendors to open-source, scale-out storage technologies from software-only vendors that make use of commodity off-the-shelf hardware. Some of the new technologies are derivatives of traditional SAN/NAS with better scalability while others are completely new. Object storage technologies such as OpenStack SWIFT have created a foundation for whole new types of applications, and big data technologies like MongoDB, Riak and Hadoop go even further to blur the lines between storage and compute. These innovations provide a means for developing next-generation applications that can collect and analyze mountains of data. This is the exciting frontier of open storage today.

This frontier looks a lot like the "Wild West." With ad-hoc solutions that have great utility but are complex to setup and maintain, many users are effectively solving one-off problems, but these solutions are often narrowly defined and specifically designed for a particular application. The question everyone starts asking is, "Can't we just evolve to having one protocol ... one technology that unites them all?"

If each of these data-storing technologies has unique advantages for specific use cases or applications, the answer isn't to eliminate protocols. To borrow a well-known concept from physics, the solution lies in a "Unified Field Theory of Storage": weaving them together into a cohesive software platform that makes them simple to deploy, maintain, and operate.

When you look at the latest generation of storage technologies, you'll notice a common thread: They're all highly available, scale-out, open source, and serve as a platform for next-generation applications. While SAN/NAS storage is still the bread-and-butter enterprise storage platform today (and will be for some time to come), these older protocols often don't measure up to the needs of applications being developed today. They run into problems storing, processing and gleaning value out of the mountains of data we're all producing.

Thinking about these challenges, how do we make these next-generation open storage technologies easy to manage and turn-key to deploy? What kind of platform could bring them all together? In short, "What does the 'Unified Field Theory of Storage' look like?"

These are the questions we've been trying to answer for the last few years at OS NEXUS, and the result of our efforts is the QuantaStor Software Defined Storage platform. In its first versions, we focused on building a flexible foundation supporting the traditional SAN/NAS protocols, but with the launch of QuantaStor v3 this year, we introduced the first scale-out version of QuantaStor and integrated the first next-gen open storage technology, Gluster, into the platform. In June, we launched support for ZFS on Linux (ZoL) and enhanced the platform with a number of advanced enterprise features, such as snapshots, compression, deduplication and end-to-end checksums.

This is just the start, though. In our quest to solve the "Unified Field Theory of Storage," we're turning our eyes to integrating platforms like OpenStack SWIFT and Hadoop in QuantaStor v4 later this year, and as these high-power technologies are streamlined under a single platform, end users will have the ability to select the type(s) of storage that best fit a given application without having to learn (or unlearn) specific technologies.

The "Unified Field Theory of Storage" is emerging, and we hope to make it downloadable. Visit OSNEXUS.com to keep an eye on our progress. If you want to incorporate QuantaStor into your environment, check out SoftLayer's preconfigured QuantaStor Mass Storage Server solution.

-William Rocca, OS NEXUS

December 31, 2012

FatCloud: Tech Partner Spotlight

We invite each of our featured SoftLayer Tech Marketplace Partners to contribute a guest post to the SoftLayer Blog, and this week, we're happy to welcome Ian Miller, CEO of FatCloud. FatCloud is a cloud-enabled application platform that allows enterprises to build, deploy and manage next-generation .NET applications.

'The Cloud' and Agility

As the CEO of a cloud-enabled application platform for the .NET community, I get the same basic question all the time: "What is the cloud?" I'm a consumer of cloud services and a supplier of software that helps customers take advantage of the cloud, so my answer to that question has evolved over the years, and I've come to realize that the cloud is fundamentally about agility. The growth, evolution and adoption of cloud technology have been fueled by businesses that don't want to worry about infrastructure and need to pivot or scale quickly as their needs change.

Because FatCloud is a consumer of cloud infrastructure from SoftLayer, we are much more nimble than we'd be if we had to worry about building data centers, provisioning hardware, patching software and doing all the other time-consuming tasks that are involved in managing a server farm. My team can focus on building innovative software with confidence that the infrastructure will be ready for us on-demand when we need it. That peace of mind also happens to be one of the biggest reasons developers turn to FatCloud ... They don't want to worry about configuring the fundamental components of the platform under their applications.


Our customers trust FatCloud's software platform to help them build and scale their .NET applications more efficiently. To do this, we provide a Core Foundation of .NET WCF services that effectively provides the "plumbing" for .NET cloud computing, and we offer premium features like a distributed NoSQL database, work queue, file storage/management system, content caching and an easy-to-use administration tool that simplifies managing the cloud for our customers. FatCloud makes developing for hundreds of servers as easy as developing for one, and to prove it, we offer a free 3-node developer edition so that potential customers can see for themselves.


The agility of the cloud has the clearest value for a company like ours. In one heavy-duty testing month, we needed 75 additional servers online, and after that testing was over, we needed the elasticity to scale that infrastructure back down. We're able to adjust our server footprint as we balance our computing needs and work within budget constraints. Ten years ago, that would have been overwhelmingly expensive (if not impossible). Today, we're able to do it economically and in real-time. SoftLayer is helping keep FatCloud agile, and FatCloud passes that agility on to our customers.

Companies developing custom software for the cloud, mobile or web using .NET want a reliable foundation to build from, and they want to be able to bring their applications to market faster. With FatCloud, those developers can complete their projects in about half the time it would take them if they were to develop conventionally, and that speed can be a huge competitive differentiator.

The expensive "scale up" approach of buying and upgrading powerful machines for something like SQL Server is out-of-date now. The new kid in town is the "scale out" approach of using low-cost servers to expand infrastructure horizontally. You'll never run into those "scale up" hardware limitations, and you can build a dynamic, scalable and elastic application much more economically. You can be agile.

If you have questions about how FatCloud and SoftLayer make cloud-enabled .NET development easier, send us an email: sales@fatcloud.com. Our team is always happy to share the easy (and free) steps you can take to start taking advantage of the agility the cloud provides.

-Ian Miller, CEO of FatCloud

This guest blog series highlights companies in SoftLayer's Technology Partners Marketplace. These partners have built their businesses on the SoftLayer Platform, and we're excited for them to tell their stories. New partners will be added to the Marketplace each month, so stay tuned for many more to come.