February 10, 2016

The Compliance Commons: Do you know our ISOs?

Editor’s note: This is the first of a three-part series designed to address general compliance topics and to answer frequently asked compliance questions.

How many times have you been asked by a customer if SoftLayer is ISO compliant?  Do you ever find yourself struggling for an immediate answer?  If so, you're not alone. 

ISO stands for International Organization for Standardization. The organization has published more than 19,000 international standards, covering almost all aspects of technology and business. If you have questions about a specific ISO standard, you can search the ISO website; if you need the full text of a standard, an online copy can be purchased through the same site.

SoftLayer holds three ISO certifications, and we’re going after more. Each certification reflects industry-standard security best practices for cloud infrastructure:

ISO/IEC 27001: This certification covers the information security management process. It certifies that SoftLayer follows industry best practices for security in cloud infrastructure as a service (IaaS) and that our information security management practices adhere to strict, internationally recognized standards. In short, obtaining this certification means SoftLayer offers a safe and secure place to live in the cloud.

ISO/IEC 27018: This certifies that SoftLayer follows the most stringent code of practice for protecting personally identifiable information (PII) in public clouds acting as PII processors. It establishes commonly accepted control objectives, controls, and guidelines for implementing measures to protect PII in accordance with the privacy principles in ISO/IEC 29100 for the public cloud computing environment. While not all of SoftLayer is public, and while we have very distinct definitions for processing PII for customers, we obtained the certification to demonstrate that our security and privacy practices are robust.

ISO/IEC 27017: This is a code of practice for information security controls for cloud services. It’s the global standard for cloud security practices—not only for what SoftLayer should do, but also for what our customers should do to protect information. SoftLayer’s ISO 27017 certification demonstrates our continued commitment to upholding the highest, most secure information security controls and applying them effectively and efficiently to our cloud infrastructure environment. The standard provides guidance in areas including, but not limited to, the following:

  • Information Security
  • Human Resources
  • Asset Management
  • Access Control
  • Cryptography
  • Physical and Environmental Security
  • Operations Security
  • Communications Security
  • System Acquisition, Development & Maintenance
  • Supplier Relations
  • Incident Management
  • Business Continuity Management
  • Compliance
  • Network Security

How can SoftLayer’s ISO certification benefit me as a customer?

Customers can leverage SoftLayer’s certifications as long as it’s done in the proper manner. Customers cannot claim that they’re ISO certified just because they’re using SoftLayer infrastructure. That’s not how it works. SoftLayer’s ISO certifications may make it easier for customers to become certified because they can leverage our certification for the SoftLayer boundary. Our SOC2 report (available through our customer portal or sales team) describes our boundary in greater detail: the customers are not responsible for certifying what’s inside SoftLayer’s boundary.  

How does SoftLayer prove its ISO compliance?

SoftLayer’s ISO Certificates of Registration are publicly available on our website and on our third-party assessor’s website. By design, our ISO certificates denote that we conform to and meet all the applicable objectives of each standard. Because the ISO standards apply the same fixed set of controls to everyone, we don’t share the reports from our audits, but we can provide our certificates.

Which SoftLayer data centers do the ISO certifications cover?

All of them! Each ISO certificate is applicable to every one of our data centers, in the U.S. and internationally. SoftLayer obtained ISO certifications on every one of our facilities because we operate with consistency across the globe. When a new SoftLayer data center comes online, there is some lag time between opening and certification because we need to be reviewed by our third-party assessor and have operational evidence available to support our data center certification. But as soon as we obtain the certifications, we’ll make them available.

Visit www.softlayer.com/compliance for a full list of our certifications and reports. They can also be found through the customer portal.

-Dana

 

February 5, 2016

Enable SSD Caching on Bare Metal Servers for a 10X IOPS Improvement

Have you ever wondered how you could leverage the benefits of an SSD at the cost of cheap SATA hard drives?

SSDs provide extremely high IOPS for reads and writes, which makes them tempting for IOPS-centric volumes. However, because SSD prices are significantly higher than SATA drive prices, IT managers are at a crossroads: burn a fortune on SSDs, or stay with SATA drives.

But there is a way to use SATA drives and experience SSD performance using some intelligent caching techniques. If you have the right PCI RAID card installed on bare metal servers, you can leverage certain SSD caching feature benefits.

When configuring a bare metal server, make sure it has sufficient drive bays (at least eight) and select an LSI (AVAGO) MegaRAID card as the RAID card. You can choose the appropriate RAID configuration for the OS and other workload data during the order process itself, so the arrays come preconfigured. As a high-speed cache device, consider ordering at least two SSDs; you can also add them to your server after deployment. These SSDs are the caching drives that will be used to improve the overall performance of the inexpensive SATA drives from which the volumes are carved.

Install MSM for Easy Management of the RAID Card

Once the server is deployed, consider installing AVAGO MegaRAID Storage Manager (MSM) for the OS installed on the server. (You can also manage the RAID controller remotely from a local machine by providing the IP address of the server where the controller is installed.)

Users can download MegaRAID Storage Manager directly from the AVAGO website for the card installed in the machine. For the popular MegaRAID SAS 9361-8i card, download MSM from the AVAGO website here.

How to Create CacheCade SSD Caching Volumes and Attach Them to Volume Drives

Follow these three steps to improve IOPS on existing volumes on the bare metal server.

Step 1: Creating CacheCade Volumes

Once SSDs are deployed on the bare metal server and regular volumes have been created, users can create a CacheCade volume to perform SSD caching. This is easily achieved by right-clicking the AVAGO controller and selecting the Create CacheCade – SSD Caching option.

Step 2: Choosing the right RAID Level and Write Policy for CacheCade Volumes

We recommend using a RAID 1 CacheCade volume, which eliminates a single point of failure at the SSD device level. To set this up, select the available SSDs on the system and choose RAID 1 as the RAID level. Click Add to add all available disks, then click Create Drive Group. Also, be sure to select Write Back as the write policy for increased I/O performance on both reads and writes to the volume being cached.

Step 3: Enabling SSD Caching For Volumes

If the virtual drives were created without SSD caching enabled, now is the right time to enable it; you can selectively enable or disable SSD caching for each virtual drive.

Right click on the volume and select Enable SSD Caching.

Performance Comparison

We ran a simple comparison on a 3.6TB RAID 50 volume (two spans of three drives) with and without SSD caching, using the IOmeter tool (available here). The workload was a 50/50 (read/write) 4KB pure random I/O workload run for about an hour against the volume.

Without SSD caching: 970 IOPS

With SSD caching: 9,000 IOPS (a 10X improvement)

The result shows a 10X IOPS improvement, though the benefit is workload dependent: the gain depends on how often reads and writes hit the same LBAs, because repeated accesses are served from the SSD cache.

This could give database applications and other IOPS-hungry workloads an instant performance boost. Try it today at SoftLayer, and see the difference!

-Subramanian 

 

February 3, 2016

Use TShark to see what traffic is passing through your gateway

Many of SoftLayer’s solutions make excellent use of the Brocade vRouter (Vyatta) dedicated security appliance. It’s a true network gateway, router, and firewall for your servers in a SoftLayer data center. It’s also an invaluable troubleshooting tool should you have a connectivity issue or just want to take a gander at your network traffic. Built into vRouter’s command line is a full-fledged terminal-based Wireshark implementation: TShark.

TShark is fully implemented in vRouter. If you’re already familiar with TShark, you know you can call it from the terminal in either configuration or operational mode. You accomplish this by prefacing the command with sudo, making the full command sudo tshark followed by the desired flags.

For those of us less versed in the intricacies of Wireshark and its command line cousin, here are a couple of useful examples to help you out.

One common flag I use in nearly every capture is -i (and as a side note, for those coming from a Microsoft Windows background, the flags are case sensitive). -i specifies the interface on which to capture traffic and immediately helps cut down on information unrelated to the problem at hand. If you don’t set this flag, the capture will default to the first non-loopback interface; in the case of vRouter on SoftLayer, that’s Bond0. Additionally, if you want to trace a packet and its reply across interfaces, you can set -i any to watch or capture traffic through all the interfaces on the device.

The second flag I nearly always use is -f, which defines a capture filter to match traffic against; only traffic matching the filter will be captured. The filter uses standard capture filter syntax (the same Berkeley Packet Filter syntax tcpdump uses). If you’re familiar with Wireshark capture filters, you can go nuts; but here are a few of the common filters I frequently use to help you get started:

  • host 8.8.8.8 will match any traffic to or from the specified host. In this case, the venerable Google DNS servers. 
  • net 8.8.8.0/24 works just like host, but for the entire network specified, in case you don’t know the exact host address you are looking for.
  • dst and src are useful if you want to drill down to a specific flow or want to look at just the incoming or outgoing traffic. These filters are usually paired with a host or net to match against.
  • port lets you specify a port on which to capture traffic. Like host and net, when used by itself, port matches both the source and destination port. For well-known services, you can also identify the port by its service name, e.g., domain for DNS.

One final cool trick with the -f filter is the conjunction and and the negation not. They let you combine search terms and specifically exclude traffic in order to create a very finely tuned capture for your needs.

If you want to capture to a file to share with a team or to plug into more advanced analysis tools on another system, the -w flag is your friend. Without -w, tshark behaves like tcpdump, and the output appears in your terminal session. If you want to load the file into Wireshark or another packet analyzer, make sure to add the -F flag to specify the file format. Here is an example:

Vyatta# sudo tshark -i Bond0 -w testcap.pcap -F pcap -f 'src 10.128.3.5 and not port 80'

The command will capture on Bond0 and write the capture to a .pcap file called testcap.pcap in the root directory of the file system. It will match only traffic on Bond0 from 10.128.3.5 that has neither source nor destination port 80. While that is a bit of a mouthful to explain, it does capture a very well-defined stream!

Here is one more example:

Vyatta# sudo tshark -i any -f 'host 10.145.23.4 and not port 22'

This command will capture traffic to the terminal that is to or from the specified IP (10.145.23.4) that is not SSH. I frequently use this filter, or one a lot like it, when I am SSHed into a host and want to get a more general idea of what it is doing on the network. I don’t care about ssh because I know the cause of that traffic (me!), but I want to know anything else that’s going to or from the host.

This is all very much the tip of the iceberg; you can find a lot more information on the TShark man page. Hopefully these tips help out next time you want to see just what traffic is passing through your gateway.

- Jeff 

 

February 2, 2016

The SLayer Standard Vol. 2, No. 4

The week in review. All the IBM Cloud and SoftLayer headlines in one place.

What does Marc Jones have to say about SoftLayer?

Our CTO Marc Jones sat down for an interview with Angel Diaz, IBM VP of Cloud Technology & Architecture and host of the IBM Cloud Dragon Dojo Series. Marc discusses his start at SoftLayer, the benefits of the SoftLayer cloud platform, dark fiber, and the importance of global reach. Instead of us telling you what he said, you can watch it yourself.

Find out a bit more about it here.

IBM Watson business gets a new general manager.

IBM’s acquisition of the Weather Company is now complete, and that means a few changes are afoot. First, all of the Weather Company’s workloads are now running in IBM Cloud data centers. And second, David Kenny, who was the Weather Company CEO, is now in charge of the Watson business.

In his new role, Kenny says his primary objective is to make Watson an even more robust platform and a leader in cognitive computing. In TechCrunch, he noted that the weather platform is not just about weather data. The massive amount of data that The Weather Channel takes in is used across various industries to help both companies and consumers make well-educated choices. All of this data will also be a boon to Watson as IBM continues to grow the AI platform with the Weather Company’s data sets.

“Obviously we ingest more weather data than others and process it in the cloud for pilots, insurers or farmers or ordinary citizens to make better informed decisions. But that platform can be reused for other unstructured data sets… this will be helpful for IBM in other business areas. What we have figured out at the Weather Company, and IBM will continue to explore across more IoT applications, is how to take data from lots of places and turn that into decisions to help make things work,” Kenny said.

Find out more about it here.

-Rachel  

January 29, 2016

Cloud, Interrupted: The Official SoftLayer Podcast, Episode 3

You’re never going to believe this. You already know the second episode of Cloud, Interrupted—the one, the only, the official SoftLayer podcast—hit the streets in December. And now, coming in hot, we’re bringing you the long-awaited third episode of Cloud, Interrupted—only a month after the last one! Contain your excitement. We’re getting good at this.

In the third episode of our authoritative, esteemed podcast, we discuss why our first podcasts were recorded in wind tunnels, we pat ourselves on the back for being doers and not scholars, and we reveal the humble, testosterone-fueled origins of the iconic Server Challenge.

Join Kevin Hazard, director of digital content, Phil Jackson, lead technology evangelist, and Teddy Vandenberg, manager of network provisioning, as they wreak havoc interrupting the world of cloud. Yet again.

You skipped that fluff-filled intro, didn’t you? We’ll reward your impatience with the CliffsNotes:

Cloud, Interrupted, Episode 3: In the end, you’ve gotta start somewhere.

  • [00:00:01] Yo yo yo, it’s the new and improved bleep bloops!
  • [00:00:25] We've finally stopped recording Cloud, Interrupted from our pillow forts. Now we just follow the mountains and valleys.
  • [00:04:23] So you want to host your own podcast? Cool. Take it from us on the ultimate, definitive, pretty-much-only guide to success: gear, software, and magical editing.
  • [00:06:24] Teddy takes us on a boring tangent about startups that’s not really a tangent at all. (You decide if it’s boring.)
  • [00:07:25] Ha ha, Kevin totally used to trick out his MySpace page.
  • [00:09:16] GOOD JOB, PHIL!
  • [00:09:26] Phil was THE most popular kid in school. That's how he started programming.
  • [00:13:40] There are two types of technical people: those that do and those that read the docs. Teddy doesn't read the docs. Ask him about YUM.
  • [00:17:59] C'mon, Kevin. No one wants to build a server at a conference for fun. What a dumb idea!

Oh Phil, Phil, Phil. Little did you know how very wrong you were. (Must’ve been the ponytail.)

- Fayza

January 27, 2016

Sales Primer for Non-Sales Startup Founders

The founder of one of the startups in our Global Entrepreneur Program reached out to me this week. He is ready to start selling his company’s product, but he's never done sales before.

Often, startups consist of a hacker and a hustler—where the tech person is the hacker and the non-tech person is the hustler. In the aforementioned company, there are three hackers. Despite the founder being deeply technical, he is the closest thing they have to a hustler. I'm sure he'll do fine getting in front of customers, but the fact remains that he's never done sales.

So where do you begin as a startup founder if you've never sold before?

Free vs. Paid
His business is B2B, focusing on car dealers. He's worried about facing a few problems, including working with business owners who don’t normally work with startups. He wants to give the product away for free to a few customers to get some momentum, but is worried that after giving it away, he won’t be able to convert them to paying customers.

Getting that first customer is incredibly important, but there needs to be a value exchange. Giving products away for free presents two challenges:

  1. By giving something away, you devalue your product in the eyes of the customer.
  2. The customer has no skin in the game—no incentive to use it or try to make it work.

Occasionally, founders have a very close relationship with a potential customer (e.g., a former manager or a trusted ex-colleague) where they can be assured the product will get used. In those cases, it might be appropriate to give it away, but only for a defined time.

The goal is sales. Paying customers reduce burn and show traction.

Price your product, go to market, and start conversations. Be willing to negotiate to get that first sale. If you do feel strongly about giving it away for free, put milestones and limitations in place for how and when that customer will convert to paid. For example, agree to a three-month free trial that becomes a paid fee in the fourth month. Or tie specific milestones to the payment, such as delivering new product features or achieving objectives for the client.

Build Credibility
When putting a new product in the market, especially one in an industry not enamored with startups and where phrases like “beta access” will net you funny looks, it helps to build credibility. This can be done incrementally. If you don't have customers, start with the conversations you’re having: “We’re currently in conversations with over a dozen companies.”

If you get asked about customers, don’t lie. Don’t even fudge it. I recommend being honest, and framing it by saying, “We’re deciding who we want to work with first. We want to find the right customer who is willing to work closely with us at the early stage. It’s the opportunity to have a deep impact on the future of the product. We're building this for you, after all.”

When you have interest and are in negotiations, you can then mention to other prospective customers that you’re in negotiations with several companies. Be respectful of the companies you’re in negotiations with; I wouldn't recommend mentioning names unless you have explicit permission to do so.

As you gain customers, get their permission to put them on your website. Get quotes from them about the product, and put those on your site and marketing materials. You can even put these in your sales contracts.

Following this method, you can build credibility in the market, show outside interest in your product, and maintain an ethical standing.

Get to No
A common phrase when I was first learning to sell was, “get to the ‘no’.” It has a double meaning: expect that someone is going to say “no” so be ready for it, and keep asking until you get a “no.” For example, if “Are you interested in my product?" gets you a “yes,” then ask, “Would you like to sign up today?”

When you get to no, the next step is to uncover why they said no. At this point, you’re not selling; you’re just trying to understand why the person you’re talking to is saying no. It could be they don't have the decision-making authority, they don't have the budget, they need to see more, or the product is missing something important. The point is, you don’t know, and your goal here is to get to the next step in their process. And you don’t know what that is unless you ask.

Interested in learning more? Dharmesh Shah, co-founder and CTO of HubSpot and creator of the community OnStartups, authored a post with 10 Ideas For Those Critical Early Startup Sales that is well worth reading.

As a founder, you’re the most passionate person about your business and therefore the most qualified to get out and sell. You don't have to be “salesy” to sell; you just need to get out and start conversations.

-Rich

January 25, 2016

The SLayer Standard Vol. 2, No. 3

The week in review. All the IBM Cloud and SoftLayer headlines in one place.

UStream joins the IBM family.
IBM has announced an exciting new addition to the family. We would like to welcome UStream to the team, where it will join a new cloud video services unit. TechCrunch reported, “Braxton Jarratt, who came to IBM as part of the ClearLeap deal, has been chosen to run this new unit. He says UStream gives the company that missing streaming piece that allows them to form this unit with a full-service enterprise video offering.”

Jarratt also said that IBM “plans to incorporate other pieces like Watson for analytics, something that customers were asking for around video delivery at CES earlier this month. They want to know information like how long people are engaged and what kinds of actions they can take to stop churn.”

Get more information on the deal here.

IBM Watson is the future of artificial intelligence.
The head of IBM Watson, Mike Rhodin, sat down for an interview with Forbes to talk about the future of artificial intelligence.

Watson’s appearance on Jeopardy! kicked off a period that Rhodin considers “in-market experimentation.” During that time, IBM worked with major names in the healthcare industry that “wanted to start to experiment with the technology–not to play Jeopardy!, but to use the underlying technology to start to solve problems.”

Rhodin noted, “The second thing that was a key decision about the launch of the commercial project was the creation of an open ecosystem: we would open up the APIs on platforms so that startups could get access to the technology and start to build out businesses on top of it.” This led to the creation of the Watson Group, made up of a few customers and a small group of startups that used the technology. That is when the ecosystem project took off.

Learn more about how Watson works and where it is going here.

-Rachel

January 22, 2016

Using Cyberduck to Access SoftLayer Object Storage

SoftLayer object storage provides a low-cost option for storing files in the cloud. There are three primary methods for managing files in SoftLayer object storage: via a web browser, using the object storage API, or using a third-party application. Here, we’ll focus on the third-party application method, demonstrating how to configure Cyberduck to perform file uploads and downloads. Cyberduck is a free and open source (GPL) software package that can connect to FTP, SFTP, WebDAV, S3, or any OpenStack Swift-based object storage such as SoftLayer object storage.

Download and Install Cyberduck

You can download Cyberduck here, with clients for both Windows and Mac. After the installation is complete, download the profile for SoftLayer object storage here. Choose any of the download links under the Connecting section; the preconfigured location doesn’t matter, as the settings will be modified later.

Once the profile has been downloaded, it needs to be modified to allow the hostname to be changed. Open the downloaded file (e.g., Softlayer (Amsterdam).cyberduckprofile) in a text editor. Locate the Hostname Configurable key (<key>Hostname Configurable</key>) and change the XML tag that follows it from <false/> to <true/>. Once this change has been made, there are two ways to load the profile: move the file to the profiles directory where Cyberduck is installed (on Windows, this is C:\Program Files (x86)\Cyberduck\profiles by default), or simply double-click the profile and Cyberduck will add it.
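If you would rather script this edit than make it by hand, the profile is an XML plist, so Python’s standard plistlib can flip the flag. Here is a minimal sketch (the function name is ours, and it assumes the Hostname Configurable key sits in the profile’s top-level dictionary, as described above):

```python
import plistlib

def make_hostname_configurable(profile_path):
    """Set 'Hostname Configurable' to true in a .cyberduckprofile plist."""
    with open(profile_path, "rb") as f:
        profile = plistlib.load(f)  # the profile is an XML plist with a top-level dict
    profile["Hostname Configurable"] = True
    with open(profile_path, "wb") as f:
        plistlib.dump(profile, f)   # write the modified profile back out as XML
```

Run it against the downloaded profile, then load the profile into Cyberduck as described above.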

Configure Cyberduck to Work with SoftLayer

Now that Cyberduck has been installed, it needs to be configured to connect to object storage in SoftLayer. You can do this by creating a bookmark in Cyberduck. With Cyberduck open, click on Bookmark in the main menu bar, then New Bookmark in the dropdown menu.

In the dropdown box at the top of the Bookmark window, select SoftLayer Object Storage (Name of Location); the location shown will depend on the profile that was downloaded. When the SoftLayer profile has been selected, the configurable options for that profile will be displayed. Enter a nickname that will identify the object storage location.

Next, depending on which data center will store the objects, the server option in Cyberduck may need to be changed. To find out which server should be specified, open a web browser and log into the SoftLayer portal. Once in the portal, click Storage, then Object Storage. Select the object storage account that will be used for this connection.

If no accounts exist, a new object storage account can be ordered using the Order Object Storage link located in the upper right-hand corner. After selecting the account, select the data center where the object storage will reside.

When the Object Storage page loads, there will be a View Credentials link under the object storage container dropdown box in the upper left section of the screen.

Clicking on that link will bring up a dialog box that contains the information necessary for creating a connection in Cyberduck. Because SoftLayer has both public and private networks, there are two authentication endpoints available. The setup for each endpoint is the same, but a VPN connection to the SoftLayer private network is necessary in order to use the private endpoint.

Here, we will be using the public endpoint. Copy the server address for the public endpoint and enter it into the Server text box in Cyberduck.

Next, select the username. It will be in the format:

object_storage_account_name:softlayer_user_name.

Then enter it into the Username text box. (Make note of the API key; it will be used later.)

Once those options have been set (Nickname, Server, and Username), close the new bookmark window. In the main Cyberduck window, you should see the newly created bookmark listed. Double-click on it to connect to the SoftLayer object storage.

At this point, Cyberduck will prompt for the API key. Use the API key noted above and Cyberduck will connect to SoftLayer object storage. Uploading files can be accomplished by selecting the files and dragging them to the Cyberduck window. Downloading can be accomplished by selecting a file in Cyberduck and dragging it to the local folder where it will be downloaded.
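Cyberduck isn’t the only client these credentials work with. SoftLayer object storage speaks the OpenStack Swift v1 auth protocol, so the same username (account:user) and API key can authenticate a script directly. Here is a minimal sketch using only the Python standard library; the endpoint URL and credential values below are placeholders, not real accounts:

```python
import urllib.request

def build_auth_request(auth_url, account, user, api_key):
    """Build a Swift v1 auth request; the credentials travel in request headers."""
    req = urllib.request.Request(auth_url)
    req.add_header("X-Auth-User", f"{account}:{user}")  # same format as the Cyberduck Username field
    req.add_header("X-Auth-Key", api_key)               # the API key from the portal
    return req

# Placeholder values -- substitute your own endpoint and credentials.
req = build_auth_request(
    "https://dal05.objectstorage.softlayer.net/auth/v1.0",  # example public endpoint
    "SLOS123456-1",   # example object storage account name
    "slusername",     # example SoftLayer user name
    "api_key_from_portal",
)
# A successful GET against the auth URL returns X-Storage-Url and X-Auth-Token
# response headers; the token is then sent as X-Auth-Token on storage requests:
# with urllib.request.urlopen(req) as resp:
#     token = resp.headers["X-Auth-Token"]
#     storage_url = resp.headers["X-Storage-Url"]
```

The private endpoint works the same way, provided you are connected to the SoftLayer private network via VPN.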

-Bryan Bush

January 18, 2016

The SLayer Standard Vol. 2, No. 2

The week in review. All the IBM Cloud and SoftLayer headlines in one place.

Ford and IBM team up to take the hassle out of driving.
Ford announced a partnership with IBM Cloud to build a new platform for analyzing transportation data. As TechCrunch reported, “The new platform will use IBM’s cloud computing platform to analyze small slices of data to look for patterns and trends that could help drivers make better decisions about their driving—or whether they should maybe use another means of transportation.”

Ford began testing the platform to run its Dynamic Shuttle model on the Ford campus. Ford explains, “Should one of the Transit vans experience a malfunction that triggers a warning light, the platform will be able to start routing requests away from that vehicle to other Transits in service—allowing another shuttle to redeploy to keep all riders on schedule.”

Learn more about how Ford and IBM are helping drivers here.

Bluemix Social Sentiment App set to better fan experience at Australian Open.
The entire Australian Open 2016 experience will be hosted by IBM’s Continuously Available Services. As a blog post from IBM Bluemix Dev explains, “The component that provides a social endpoint, Social Sentiment Application, for fan experiences is hosted on a Bluemix hybrid cloud that follows several design principles: Cognitive Design, Microservices, High Availability, Parallel Functions and Disaster Avoidance.”

One highlight of the cognitive design is that it will allow for an engaging user experience, further developing the interactivity between people and machines. The post notes, “The system enables humans and machines to understand the crowd and their opinions focused around tennis players. Over time, the trend of tennis player sentiment is displayed through IBM’s SlamTracker, which learns player popularity movement. Humans interact with the Social Sentiment Application through Twitter, which has a direct impact on social sentiment.”

Read more about the application’s design principles here.

IBM named a hybrid cloud leader by Forrester and Synergy.
Reports from both Forrester and Synergy Research highlighted IBM’s continued cloud growth in the hybrid arena. “These new reports further underscore the momentum IBM has gained among its customers that are increasingly turning to IBM for help connecting cloud services and applications to core systems that may always remain on-premises, due to such factors as regulatory compliance, control and cost.”

Forrester’s report studied many hybrid cloud solutions and noted, “Leaders such as IBM offer deep and broad support for pre-built application and infrastructure templates, powerful provisioning and configuration management, role-based controls, and rich cost, performance, and capacity management features.”

Learn more about Forrester’s and Synergy’s findings here.

-Rachel

January 15, 2016

Vuukle: Helping Publishers Manage Comments and Match Readers with Content

I recently had a conversation with Ravi Mittal, the founder of a company called Vuukle. Vuukle is based in New Delhi and has just graduated from our Catalyst startup program.

Vuukle actually started out in Silicon Valley, where Ravi launched his first product iteration with the goal of sourcing public opinion on the Web. Key to his initial offering was a proprietary algorithm he developed to sort comments in order of credibility: a highly valuable aspect of the product, but, as he quickly learned, not enough value to sustain a product on its own.

Through experiments with Vuukle’s early customers (including the Santa Clara Weekly), a major problem emerged that appeared to pervade the online publishing industry: reader engagement wasn’t sticky enough to compel readers to post (and reply to) comments. To solve this underlying problem, Vuukle pivoted to a new type of comment publishing system that helps publishers track engagement through custom analytics.

The major problem Vuukle tackles is not unique to the publishers it serves. It’s a large-scale global problem, extending beyond news publishers into all content-based publishing online—so you can imagine how much competition is out there around the globe in this space. When I asked Ravi how he differentiates Vuukle from dominant players like Livefyre and Disqus, he offered, "Most customers aren’t using those other services; they have their own commenting systems. If anything, we were pitted against Facebook commenting. In the few cases where Disqus is being used, we’ve seen problems with load times, throttling limits and so on."

To set Vuukle in a class of its own, Ravi and his team—which is globally dispersed, with people in Egypt, Ukraine, the U.S., and India—have architected an infrastructure for super-fast load times at scale, employing SoftLayer servers in our Singapore and India data centers and working with a third party, ScaleDB, to handle database queries and traffic. Of course, that alone doesn’t give them a unique value proposition; Vuukle truly sets itself apart by dropping upfront publisher costs to a minimal platform access fee and offering a 50/50 revenue share model. Vuukle is not only set up to handle high-traffic websites with commenting, but it also promotes user engagement with comments by integrating with publishers’ actual publishing systems. Vuukle passes traffic between posts and offers editors insights into how readers are commenting, in addition to creating a new revenue stream through comments—from which it sources the majority of its own income.

Interestingly, Ravi’s move from the Valley to India came because of family reasons and ended up being a blessing to the business. Early after his move, he realized that there was a ton of opportunity for Vuukle with the major Indian newspapers that had cobbled together their own infrastructure to power websites. Just a couple years in, Vuukle is powering comments on The Hindu, Deccan Chronicle, and Indian Express, three of the most highly trafficked news websites in the country. To help global adoption amongst all sorts of publishers, Vuukle also offers a free WordPress plugin.

Vuukle seems to have gained traction through Ravi’s hard work chasing customers at home, and he’s proud to be finding success despite being bootstrapped. When questioned about the local startup scene, Ravi said, “Nothing much is unique in the Indian startup ecosystem. [It's] kind of like a gold rush in India, where founders are hunting for investment before they have a clear market path and products that are market-ready. A lot of copycat businesses [are] launching that are focused on Indian markets (taking models from the States and elsewhere). Not many patents are being filed in India—not much actual innovation, indicative of a proliferation of large seed round raises (around $1 million)—and a lot of startups spend funding on staff they don’t need.”

The future seems bright for Vuukle. Its growth beyond India’s borders will happen soon and will be financed through revenue rather than venture capital rounds, of which Ravi seems quite wary. Now that Vuukle has graduated from Catalyst, I was keen to hear whether the company would keep the majority of its infrastructure with IBM. It turns out prospective Vuukle customers love hearing that their infrastructure is hosted on our cloud—the scale and reliability we offer their solution is a core aspect of Vuukle’s value proposition.

I really think this is an exciting company to watch. I look forward to seeing greater success for Vuukle as it grows with our ever-expanding footprint of data centers in Asia and globally.

-Qasim

Based in Toronto, Qasim Virjee manages the Catalyst Startup Program in Canada and can be reached on Twitter (@qasim) or via his personal website.
