Posts Tagged 'Tool'

March 22, 2013

Social Media for Brands: Monitor Twitter Search via Email

If you're responsible for monitoring Twitter for conversations about your brand, you're faced with a challenge: You need to know what people are saying about your brand at all times AND you don't want to live your entire life in front of Twitter Search.

Over the years, a number of social media applications have been released specifically for brand managers and social media teams, but most of those applications (especially the free/inexpensive ones) differentiate themselves only by the quality of their analytics and how close to real time their data is reported. If that's what you need, you have plenty of fantastic options. Those differentiators don't really help, though, if you want to take a more passive role in monitoring Twitter search ... You still have to log into the application to see your fancy dashboards with all of the information. Why can't the data come to you?

About three weeks ago, Hazzy stopped by my desk and asked if I'd help build a tool that uses the Twitter Search API to collect brand keyword mentions and send an email alert with those mentions in digest form every 30 minutes. The social media team had been using Twilert for these types of alerts since February 2012, but over the last few months, messages had been delayed due to issues connecting to Twitter search ... It seems that the service is so popular that it hits Twitter's limits on API calls. An email digest scheduled to be sent every thirty minutes ends up going out ten hours late, and ten hours is an eternity in social media time. We needed something a little more timely and reliable, so I got to work on a simple "Twitter Monitor" script to find all mentions of our keyword(s) on Twitter, email those results in a simple digest format, and repeat the process every 30 minutes when new mentions are found.

With Bear's Python-Twitter library on GitHub, connecting to the Twitter API is a breeze. Why did we use Bear's library in particular? Just look at his profile picture. Yeah ... 'nuff said. So with that Python wrapper to the Twitter API in place, I just had to figure out how to use the tools Twitter provided to get the job done. For the most part, the process was very clear, and Twitter actually made querying the search service much easier than we expected. The Search API finds all mentions of whatever string of characters you designate, so instead of creating an elaborate Boolean search for "SoftLayer OR #SoftLayer OR @SoftLayer ..." or any number of combinations of arbitrary strings, we could simply search for "SoftLayer" and have all of those results included. If you want to see only @ replies or hashtags, you can limit your search to those alone, but because "SoftLayer" isn't a word that gets thrown around much without referencing us, we wanted to see every instance. This is the code we ended up working with for the search functionality:

def status_by_search(search):
    # Query the Twitter Search API (via python-twitter) for the search term.
    statuses = api.GetSearch(term=search)
    # Keep only Tweets newer than the highest Tweet ID we've already reported.
    results = filter(lambda x: x.id > get_log_value(), statuses)
    returns = []
    if len(results) > 0:
        for result in results:
            returns.append(format_status(result))

        # Record the newest Tweet ID so the next run skips these results.
        new_tweets(results)
        return returns, len(returns)
    else:
        # Nothing new since the last run, so stop without sending a digest.
        exit()

If you walk through the script, you'll notice that we want to return only unseen Tweets to our email recipients. Shortly after we got the Twitter Monitor up and running, we noticed how easy it would be to get spammed with the same messages every time the script ran, so we had to filter our results accordingly. Twitter's API allows you to request Tweets with a Tweet ID greater than one that you specify; however, when I tried designating that "oldest" Tweet ID, we had mixed results ... Whether due to my ignorance or a fault in the implementation, we were getting fewer results than we should. Tweet IDs are unique and numerically sequential, so they can be relied on just as well as a datetime (and they're far easier to work with). I decided to use the highest Tweet ID from each batch of processed messages to filter the next set of results. The script stores that Tweet ID and uses a little bit of logic to determine which Tweets are newer than the last Tweet reported.

def new_tweets(results):
    # If this batch contains a Tweet newer than the one on record, store its ID.
    if get_log_value() < max(result.id for result in results):
        set_log_value(max(result.id for result in results))
        return True


def get_log_value():
    # Read the highest Tweet ID reported so far (tweet.id must be seeded with
    # a starting ID before the first run).
    with open('tweet.id', 'r') as f:
        return int(f.read())


def set_log_value(messageId):
    # Overwrite the stored Tweet ID with the newest one we've processed.
    with open('tweet.id', 'w+') as f:
        f.write(str(messageId))

Once we culled out our new Tweets, we needed our script to email those results to our social media team. Luckily, we didn't have to reinvent the wheel here: We added a few lines that enabled us to send an HTML-formatted email over any SMTP server. One of the downsides of the script is that the login credentials for your SMTP server are stored in plaintext, so if you can come up with an alternative that adds a layer of security to those credentials (or lets you send with different kinds of credentials), we'd love for you to share it.
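
That email handling boils down to a few lines of Python's smtplib. Here's a rough sketch of the idea (the server address, credentials and send_digest function below are illustrative placeholders, not the actual names used in the script):

import smtplib
from email.mime.text import MIMEText

# Hypothetical values -- substitute your own mail server and credentials.
SMTP_HOST = 'mail.example.com'
SMTP_PORT = 587
SMTP_USER = 'alerts@example.com'
SMTP_PASS = 'plaintext-password'  # stored in plaintext, as noted above


def send_digest(subject, html_body, from_addr, to_addr):
    # Build an HTML-formatted message and hand it off to the SMTP server.
    msg = MIMEText(html_body, 'html')
    msg['Subject'] = subject
    msg['From'] = from_addr
    msg['To'] = to_addr

    server = smtplib.SMTP(SMTP_HOST, SMTP_PORT)
    server.starttls()  # drop this (and login) if you're sending through an open relay
    server.login(SMTP_USER, SMTP_PASS)
    server.sendmail(from_addr, [to_addr], msg.as_string())
    server.quit()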

From that point, we could run the script manually from the server (or a laptop for that matter), and an email digest would be sent with new Tweets. Because we wanted to automate that process, I added a cron job that would run the script at the desired interval. As a bonus, if the script doesn't find any new Tweets since the last time it was run, it doesn't send an email, so you won't get spammed by "0 Results" messages overnight.

The script has been in action for a couple of weeks now, and it has gotten our social media team's seal of approval. We've added a few features here and there (like adding the number of Tweets in an email to the email's subject line), and I've enlisted the help of Kevin Landreth to clean up the code a little. Now, we're ready to share the SoftLayer Twitter Monitor script with the world via GitHub!

SoftLayer Twitter Monitor on GitHub

The script should work well right out of the box in any Python environment with the required libraries after a few simple configuration changes (a rough sketch of the Twitter credential setup follows the list):

  • Get your Twitter Consumer Key, Consumer Secret, Access Token and Access Token Secret from https://dev.twitter.com/
  • Copy/paste that information where noted in the script.
  • Update your search term(s).
  • Enter your mailserver address and port.
  • Enter your email account credentials if you aren't working with an open relay.
  • Set the self.from_ and self.to values to your preference.
  • Ensure all of the Python requirements are met.
  • Configure a cron job to run the script at your desired interval. For example, if you want to send emails every 10 minutes: */10 * * * * <path to python> <path to script> > /dev/null 2>&1
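
Pulling the Twitter pieces of that configuration together, the top of the script might look something like this sketch (the variable names and placeholder values are illustrative; use whatever the script actually designates):

import twitter  # Bear's python-twitter library

# Hypothetical placeholders -- paste in the values from https://dev.twitter.com/
api = twitter.Api(consumer_key='YOUR_CONSUMER_KEY',
                  consumer_secret='YOUR_CONSUMER_SECRET',
                  access_token_key='YOUR_ACCESS_TOKEN',
                  access_token_secret='YOUR_ACCESS_TOKEN_SECRET')

search_term = 'SoftLayer'  # the keyword(s) you want to monitor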

As soon as you add your information, you should be in business. You'll have an in-house Twitter Monitor that delivers a simple email digest of your new Twitter mentions at whatever interval you specify!

Like any good open source project, we want the community's feedback on how it can be improved and what other features we could incorporate. This script uses the Search API, but we're also starting to play around with the Stream API and SoftLayer Message Queue to make some even cooler tools to automate brand monitoring on Twitter.

If you end up using the script and liking it, send SoftLayer a shout-out via Twitter and share it with your friends!

-@SoftLayerDevs

July 27, 2011

ClickTale: Tech Partner Spotlight

This is a guest blog from Shmuli Goldberg of ClickTale, an industry leader in customer experience analytics, providing businesses with revolutionary insights into their customers' online behavior.

Understanding the User Experience with In-Page Analytics

Since ClickTale's start back in 2006, we have understood that engaging visitors on a website is the first step to increasing conversions. Although traditional web analytics are great for delivering general statistics such as number of visitors or pages per visit, they leave a big black hole when it comes to understanding what happens inside the pages themselves.

ClickTale's In-Page Analytics feature set enables you to identify, observe, aggregate and analyze visitors' actual interaction inside your site, so you know exactly what page elements work, what to optimize and how to increase visitor engagement.

Our wide range of web optimization tools includes Mouse Tracking, Heatmap Suite and Conversion Analytics solutions, but it was our Visitor Recordings feature that started it all. It gives you a front row seat to your visitors' browsing sessions and delivers a thorough, in-depth view into what your visitors are focusing on and interacting with inside the pages themselves. All you need to do is grab the popcorn.

Our heatmaps are aggregated reports that visually display what parts of a webpage are looked at, clicked on, focused on and interacted with by your online visitors. See exactly which images, text and call-to-action buttons your visitors think are hot and which are not!

Both of these features show you instantly how to go about optimizing your website, so you don't have to guess.

As a fully hosted subscription service, ClickTale is quick and easy to set up. We believe our wide range of heatmaps, behavioral analytics and full video playback make ClickTale the perfect way to round out your traditional web analytics suite. For more information, please visit www.clicktale.com.

- Shmuli Goldberg, ClickTale

This guest blog series highlights companies in SoftLayer's Technology Partners Marketplace.
These Partners have built their businesses on the SoftLayer Platform, and we're excited for them to tell their stories. New Partners will be added to the Marketplace each month, so stay tuned for many more to come.

October 20, 2010

Happiness is a Warm Firmware Update

I thought this was pretty cool. SoftLayer has just added a firmware upgrade tool to the customer portal. No more waiting for SoftLayer to upgrade your firmware, and no more uncontrollable downtime when you don't want it. The new upgrade tool places upgrade control firmly in the hands of customers, giving them the ability to march to their own drummer.

Simply click the relevant radio button, press update and the upgrade begins. If there is a problem, SoftLayer gets notified and we will replace any failed components to get the customer back online. Done. How cool is that?


-@quigleymar

September 10, 2008

Help! My Server Blocked Me!

Ok, the title of this blog may sound funny, but you would be surprised how many phone calls I get about that very subject. Sure, it's not that specific case every time; sometimes it's a software issue, other times hardware. But in the end, not being able to access your server is the worst feeling in the world.

Enter KVM over IP (also known as Keyboard-Video-Mouse).

Yes, boys and girls, this wonderful feature, provided on all mid- to high-performance multi-core servers, can be your best friend in a time of need. While on a routine support call, a customer of mine stated that the server was blocking not only him but a lot of his customers. I kept a level head and told him it was no problem. He paused for a moment, then let me know just how big a deal it was. While he was explaining, I promptly used the KVM to log in to his server and shut down the firewall. All of a sudden he stopped talking and said, "It's working! What did you do?" I explained to him that KVM works just as if you were hooking up a console to your server and can be used even if your public Ethernet cable is unplugged. I went on to show him where it was in his portal and how all of this was given to him for free. I also explained that the issue had been fixed from my desk without my ever having access to either the public or private ports on his server. The customer had never heard of such a feature and was amazed at how easy it was to use.

The beauty of KVM over IP is that it removes the one thing many server owners dread: not being able to be in the data center when issues arise with their standard connection methods (RDP, SSH). With KVM over IP, we give the customer a solution to that problem. Via KVM, you can log in to the management interface card, which in most cases resides on an entirely different network, and within seconds you have access to your terminal as if you were standing right there in the datacenter! Not only can you connect to your server, you can also manually power it on/off and reboot it, all within the same management screen. Beyond server access, you can monitor temperature readings as well as fan speeds in the server. The KVM card is a HUGE tool in any SoftLayer customer's toolbox and one that we on the Operations Team use often.

Here at SoftLayer we are always thinking about how to make your business easier to run, whether that means implementing global services such as CDN or making sure our customers have basic access to their servers in the event of a crisis. Since starting my career here at SoftLayer and learning of the KVM feature, I've made it a point to inform customers of the KVM interface along with all of the other features that are offered to them (and believe me, they never stop coming!), so be sure to check our announcements page because you never know what we will come out with next!

-Romeo
