Tuesday, May 31, 2011

13 More Tools for Testing the Performance of Your Blog

Bodybuilders train and "test" to the extreme. If we only did a fraction of that for our blogs...

Developing great blog content is as much about the content itself as about the way it is delivered. I’ve spoken about this multiple times, but it’s worth repeating: the speed of your blog impacts your search engine ranking and your user experience!


In fact, I mention this briefly in the first post of the Blog Content Series as one of the often-overlooked elements of your content strategy!


In addition, I’ve already shared 5 sites that help you test a web host as well as 10 sites to test site speed, page load, and caching, but I wanted to share my entire list of “blog performance” resources that I use to test a blog, so that you have everything I have when running diagnostics.


So here are 13 more tools for testing your blog so that you can optimize it to the max! Be aware that some of these are a bit more “nerdy” than the ones previously mentioned; in other words, you might need a bit more technical knowledge to use them well.


Oh, and yes, I did in fact just say “to the max”.



1. Fiddler Debugging Tool



Fiddler is an HTTP debugging proxy that enables you to analyze both incoming and outgoing traffic to your blog.


2. Cuzillion



Cuzillion helps you analyze how the pages on your blog and their components interact with each other. If you’ve got custom templates for your blog, this can be especially helpful. The author (Steve Souders) is the same guy who created YSlow!


3. Httperf by HP



httperf is a tool for measuring web server performance.
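If you want a feel for what httperf measures, here’s a tiny Python sketch that does the same kind of thing by hand: fire a batch of requests at a URL and report the request rate and average response time. The URL is a placeholder, and this is only an illustration of the idea, not a substitute for httperf’s statistics.

```python
import time
import urllib.request

def mini_load_test(url, num_requests=50):
    """Fire sequential GET requests and report rate and average latency."""
    latencies = []
    start = time.time()
    for _ in range(num_requests):
        t0 = time.time()
        with urllib.request.urlopen(url) as resp:
            resp.read()
        latencies.append(time.time() - t0)
    elapsed = time.time() - start
    print(f"{num_requests / elapsed:.1f} requests/sec, "
          f"avg response {sum(latencies) / len(latencies) * 1000:.0f} ms")

mini_load_test("http://example.com/")  # placeholder URL
```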


4. IBM Page Detailer



IBM Page Detailer is a graphical tool that enables web content providers to rapidly and accurately measure client-side performance of web pages. Neato.


5. Monitor Us



Mon.itor.us offers elegant, all-in-one web server monitoring as a service for sysadmins, webmasters, bloggers, individuals and small business owners.


6. PushToTest – Test Maker



PushToTest Test Maker is a free tool to test your blog’s scalability and performance.


7. Pylot Performance Tool



Pylot is a free, open-source performance and testing tool for your blog.


8. Wbox HTTP Testing



Wbox is a free HTTP testing tool that can also analyze your server’s file compression capabilities.
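To see what a compression check involves, here’s a rough Python equivalent of the kind of test Wbox performs, using only the standard library: request a page with Accept-Encoding: gzip and report whether the server answered with a compressed response. The URL is a placeholder.

```python
import urllib.request

def check_gzip(url):
    """Ask for gzip and report whether the server complied."""
    req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
    with urllib.request.urlopen(req) as resp:
        encoding = resp.headers.get("Content-Encoding", "none")
    print(f"{url}: Content-Encoding = {encoding}")

check_gzip("http://example.com/")  # placeholder URL
```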


9. Web Page Analyzer



Web Page Analyzer is a super-simple web-based performance tool.


10. Site Perf



Another simple tool for testing load speed, Site Perf simulates natural browsing behavior and reports the results to you.


11. JMeter – Apache Testing



JMeter is an Apache project that allows you to load-test your web hosting setup (and despite the heading, it isn’t limited to Apache web servers).


12. The Grinder



The Grinder is a Java load-testing framework. It can help you create automated tests as well, scripted in Jython.
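As a rough sketch of what a Grinder test script looks like (based on the standard Grinder 3 “hello world”; check the Grinder docs for the exact API of your version), here is a minimal Jython script. It runs inside the Grinder agent process, which supplies the net.grinder packages, so it will not run under plain CPython.

```python
# Minimal Grinder 3 test script (Jython).
from net.grinder.script import Test
from net.grinder.plugin.http import HTTPRequest

# Wrap an HTTPRequest so the Grinder records timing stats for it.
test1 = Test(1, "Blog homepage")
request1 = test1.wrap(HTTPRequest())

class TestRunner:
    """Each worker thread creates one TestRunner and calls it repeatedly."""
    def __call__(self):
        request1.GET("http://example.com/")  # placeholder URL
```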


13. Open Web Load



Open Web Load is a free, open-source load-testing tool for web apps.


Finally, as I mentioned in the beginning, here are the two other posts that cover performance-related tools and testing that you might want to check out: 5 Sites That Help You Test a Web Host, and 10 Sites to Test Site Speed, Page Load, and Caching.



[This is part of the Developing Great Blog Content Series. Image from sgroi.]


Search marketing stats round up

Here's a selection of recent search stats, taken from a range of sources, including the UK Search Engine Benchmarking Report, the Internet Statistics Compendium, and others.


Topics covered include mobile search, SEO budgets, overall market size, Google's organic CTR, and user behaviour...

Mobile search (Performics and ROI)



  • According to a Performics study, people most often used mobile search at home in the evening (81%), followed by at home on weekends (80%).

  • 66% use mobile search while watching TV, something that should get advertisers thinking, while 61% said they use it at work.

  • 71% use mobile search to find information about a product or service after seeing an ad, and 68% use it to find the best price for a product.

Mobile search advertising (Efficient Frontier)


  • Recent stats from Efficient Frontier found that, on average, mobile CTR is 2.7 times desktop CTR, though this varies between sectors; it can be as much as five times desktop CTR.

  • Cost per click from mobile search advertising is lower, at 60% of desktop CPC on average.

  • According to our recent Search Engine Benchmarking report, the proportion of companies using mobile search doubled from 8% in last year's report to 16% this year.

User search behaviour (Performics)


  • 88% of respondents will click on a result that contains the exact phrase they searched for, while 89% said they will alter their search query if they don’t find the results they’re looking for.

  • 89% will ultimately change search engines if they don’t find the results they’re looking for, and 79% will go through multiple pages of results if their query isn’t answered on the first page.

  • 53% of respondents said they're more likely to click on a listing if it includes an image, and 48% said they will click on a company or brand if it appears multiple times in the SERPs.

  • 26% said they were more likely to click on a search result if it included a video.

Multilingual search (Econsultancy / Guava UK Search Engine Marketing Benchmark Report 2011)


  • Only 9% of companies are currently using multilingual paid search or SEO campaigns. The majority of companies (75%) have no plans to run these types of campaigns.

  • The majority of companies (69%) have no plans to run multi-territory paid search or SEO campaigns through their agency, while only 13% are currently running multi-territory campaigns.


SEO budgets (Econsultancy)


  • 52% of companies carry out SEO entirely in-house, while 17% use an agency exclusively. Some 29% report they use both an agency and in-house resources for SEO.

  • On average, 22% of marketing budgets are spent on search engine marketing.

  • Just over a third of companies (35%) are spending £5,000 or less on SEO per year, while some 65% are spending more than £5,000 on SEO each year.


US search market size (Econsultancy / SEMPO State of Search Marketing Report 2011)



  • The value of the North American search market rose to $16.6bn in 2010 and is predicted to reach $19.3bn in 2011.




  • More than half of client-side respondents to the survey (54%) plan to spend more on SEO this year. On average, respondents expect to increase their SEO spend by 43%, which would bring overall SEO spending close to $3bn.

  • Companies are also expecting to spend 31% more on PPC, and while that is less than last year’s expected increase (37%), the overall trend is still upward.

UK search market size (Econsultancy SEO Agencies Buyer's Guide 2011)


  • During 2010, we estimated that the natural search marketing industry in the UK grew by 16%, reaching a value of £436m, up from £376m in 2009.

  • This represents approximately 12% of the value of the total UK search engine marketing sector last year, which Econsultancy estimates to have been worth £3.63bn. The paid search marketplace in 2010 has been valued at £3.19bn.





Google's organic CTR (Optify)



  • The average CTRs for the top three positions in Google's SERPs are 36.4%, 12.5% and 9.5% respectively.

  • A listing above the fold on page one of Google produces an average CTR of 19.5%, and being on page one overall produces an average of 8.9%.

  • The second page has value, but far less (a 1.5% CTR), although the first position on page two produces a slightly higher click-through rate than the last position on page one.
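To make those CTRs concrete, here’s a quick Python sketch, using a made-up monthly search volume purely for illustration, that turns the position-level CTRs above into expected clicks:

```python
# Optify's average organic CTRs by Google position (from the stats above).
avg_ctr = {1: 0.364, 2: 0.125, 3: 0.095}

monthly_searches = 10_000  # hypothetical keyword volume, for illustration only

for position, ctr in avg_ctr.items():
    clicks = monthly_searches * ctr
    print(f"Position {position}: ~{clicks:,.0f} clicks/month")
```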

Saturday, May 28, 2011

How To Succeed At Facebook Advertising

Gordmans opened two new stores in Minneapolis and promoted them with several Facebook campaigns, working with BlitzLocal.com.


They created two different types of campaigns: one advertising an event, and another advertising a tab. Both were targeted at the city level. Because the scope was so narrow, the tests included adding the city name to the ad image itself. Overall, though, these ads definitely helped garner more visitors.


Sponsored Stories Outperformed Regular Facebook Ads


There are two types of sponsored stories: a sponsored like, which targets friends of your fans, and a sponsored post, which shows messages to existing fans. Gordmans ran a highly targeted sponsored like ad:



  • within the regions where the retailer has its 68 retail locations

  • female demographic

  • keywords related to bargain-hunting


While most Facebook ads are lucky to get a 0.05 percent clickthrough rate, this campaign drove a 0.4 percent CTR on the first day, which fell by 45 percent within 48 hours to 0.2 percent.


Generally, anything at or above 0.1 percent is considered highly optimized. Sponsored likes also cut the cost per click by 70 percent and the cost per fan by 83 percent overall. That’s like getting a 77 percent discount from Facebook.




In two days, this ad drove 515 clicks for $76 and gained 418 new fans. That works out to 18 cents per fan and a click-to-fan conversion rate of 81 percent.


Most brands out there are paying between $2 and $10 per fan, the former via self-serve ads and the latter via premium ads. At $0.18 for a new fan, one who is giving your brand permission to talk to them, that’s a great cost of acquisition.


Gordmans found that the key to success with Facebook advertising is leveraging the endorsement of existing fans. People are far more likely to click on events that are associated with what their friends are doing.


Highly Engaged Content Equals Positive Fan Growth


The creative-refresh demands of social require you to iterate and refresh your content and creative much more quickly than with other types of online marketing. Gordmans knew they needed to rotate ads to keep them fresh: Facebook ads are typically served to the same users multiple times, often in the same day, so users quickly tune out repeat ads.


Gordmans also used the Webtrends Apps platform to develop fresh, engaging applications that reward customers for engaging through fans-only promotions.


While apps have about a 10-to-14-day shelf life before people start to drop off in interaction, ads have around three to five days before you see a dramatic drop-off. But because Gordmans’ wall postings resonated well with the brand, only five percent of fans have unsubscribed from the page.




Geo-Targeting Works


The average human attention span is about 30 seconds, so successful Facebook advertisers try to relate images to their audience, for example by serving an image of a local landmark or, in Gordmans’ case, by including the city name in the ad image to garner more attention.




Injecting the city name into the ad image, in conjunction with the geo-targeting, made the ads more appealing and relevant.


Gordmans found that geo-targeted ads with the city name on the ad image performed better than the ads without it. With geo-targeted ads that offered fans the opportunity to check in and claim deals, Gordmans was able to drive customers to its brick-and-mortar stores.


More Earned Media At A Cheaper Rate


By measuring the number of impressions the Facebook page generated over time, then applying an estimated $5 cost per thousand impressions (CPM), we can determine the earned media value of the brand.


Earned media represents impressions generated for free, from efforts outside of the traditional ad spend, which includes viral and word-of-mouth publicity such as likes and shares.



This type of exposure has a high quality because it leverages the trust of friends. With over 38 million impressions over a period of 79 days, at the aforementioned $5 CPM, we get $190,000 earned media value for that time period, which represents how much ad spend would have been required to achieve the same number of impressions via paid media.


Extended out over a year, the value is $879,000 per year, or about $4.5 million in perpetuity, applying a 20 percent discount rate to the projection of earned media over time.
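For the curious, here’s the arithmetic behind those figures as a small Python sketch that reproduces the numbers above (rounding in the post accounts for the small differences):

```python
impressions = 38_000_000     # over 79 days
cpm = 5.00                   # $ per thousand impressions
days = 79
discount_rate = 0.20         # used to value the earned media in perpetuity

earned_value = impressions / 1000 * cpm
annualized = earned_value / days * 365
perpetuity = annualized / discount_rate

print(f"Earned media over {days} days: ${earned_value:,.0f}")  # ~$190,000
print(f"Annualized: ${annualized:,.0f}")                       # ~$878,000
print(f"Perpetuity value: ${perpetuity:,.0f}")                 # ~$4.4 million
```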


What’s Next?


Now in the works is a new Facebook Places strategy to drive check-ins, shares, and coupons.


Gordmans has had a lot of success running Google AdWords campaigns focused on letting users redeem coupons. Running similar campaigns on Facebook will reinforce the Google campaigns, and with Facebook’s social twist tied into the coupon redemption strategy, they expect to see excellent results.


Veronica Stecker is marketing and social media planner at Gordmans. Dennis Yu is co-founder and chief executive officer of BlitzLocal.com.

Tuesday, May 10, 2011

What Works on Tumblr


The biggest misconception about Tumblr is that it’s just another blog platform and that good content is good content. Brands on Tumblr have yet to catch up with what “the kids” are doing. There are Tumblrs with tens of thousands of followers, run by people who sit at their computers just waiting to reblog the next thing posted. In a world where trends last a day at most, the circulation of this content within the communities is never-ending. Popular Tumblr accounts have a lot of things in common.


A lot of the content may seem very young, and it is: over 37% of traffic comes from people aged 18-34. (Naturally, there are geeky Tumblrs too.) This is the perfect platform for brands that want to share quick, lighthearted content that is easy to consume. Some small clothing brands are using Tumblr as a branding initiative. Other brands are using it as a way for loyal consumers to see a little closer into their world, creating a unique brand experience. We have seen many different ways that brands are using Tumblr for their business, but maybe the reason it doesn’t CLICK with users is that brands treat their Tumblr account like a blog.


Listen up brands, because here is what works on Tumblr.


1. Re-blogging Popular Content


A great way to find popular content and topics is by looking through the Explore section of Tumblr. This will show you where the popular kids hang out and how they got there. Taking content from these top users and reblogging it with a comment is sort of like drawing a mustache on your index finger... everyone is doing it. Whether you are a brand or just a user on Tumblr, ending up on this list is a goal worth having.



2. Asking & Answering Questions on Followers' Tumblogs


One of the biggest differences between Tumblr and blog platforms in general is that Tumblr is a true social networking community. More often than not, the number of followers one has is a direct result of how much effort and activity goes in. Having a growing list of people you follow gives you an ample amount of content to reblog and users to ask questions of. Many users post their questions and answers live on their Tumblr, which exposes the brand/user relationship publicly. This can be a great way to let the community know about excellent customer service. (This should sound awfully familiar to Twitter users.)


The reblogging process can be broken down into four steps. Here is a lesson from popular Tumblr user Rachel.



a. Search through your dashboard for the best content from the people you follow to reblog.


b. Reblog what is most relevant to you or your brand, with commentary of your own.


c. Watch the engagement between what you posted, your followers, and the original user's followers.


d. Thank the community.


3. Participating in Tumblr Trends




4. Using Popular Tags


Pay close attention to the tags underneath the posts of the people you follow. Many people tend to overuse tags in their posts because they want to target a certain audience. The way to follow content you might be interested in as a user is to type a tag into the search box in the top right corner and click “Track this tag“. Discovering which tags are popular and most commonly used will help get your content to show up on the lists people are tracking. By using many tags, there is a greater chance of showing up on these lists and exposing your Tumblr to new sets of eyes.



While brands are slowly but surely catching on, the everyday Tumblr audience is not taking the bait. The immediate professional conclusion that comes to my mind is that brands have not fully integrated as users on Tumblr. This reminds me of when brands on Twitter would not take the “persona” route and were very cautious about what they would tweet, when, how often, etc. As time went on, brands developed themselves as personas on Twitter, giving users a way to interact and engage with the brand on a social level. It is totally normal for a brand to be cautious and stick a toe in the water to test the temperature, but adjusting to a social level in a social environment is inevitable. More brands will engage and join Tumblr on a social level, and what works will keep working for brands and users alike.


Are you on Tumblr? Well then follow BlueGlass here!


* Top photo credit: http://www.flickr.com/photos/smoy/


"

Thursday, May 5, 2011

Fat Pandas and Thin Content

[Image: sad panda]

If you’ve been hit by the Panda update or are just worried about its implications, you’ve probably read a lot about “thin” content. We spend our whole lives trying to get thin, and now Google suddenly hates us for it. Is the Panda update an attempt to make us all look like pandas? Does Google like a little junk in the trunk?

It’s confusing and it's frustrating, especially if you have real money on the line. It doesn’t help that “thin” content has come to mean a lot of things, and not every definition has the same solution. To try to unravel this mess, I'm going to present 7 specific definitions of “thin” content and what you can do to fatten them up.

Quality: A Machine’s View

To make matters worse, “thin” tends to get equated with “quality” – if you’ve got thin content, just increase your quality. It sounds good, on the surface, but ultimately Google’s view of quality is defined by algorithms. They can’t measure the persuasiveness of your copy or the manufacturing standards behind your products. So, I’m going to focus on what Google can measure, specifically, and how they might define “thin” content from a machine’s perspective.

1. True Duplicates (Internal)

True, internal duplicates are simply copies of your own pages that make it into the search index, almost always the result of multiple URLs that lead to the same content. In Google’s eyes, every URL is a unique entity, and every copy makes your content thinner:

[Image: internal duplicates]

A few duplicates here and there won’t hurt you, and Google is able to filter them out, but when you reach the scale of an e-commerce site and have 100s or 1000s of duplicates, Google’s “let us handle it” mantra fails miserably, in my experience. Although duplicates alone aren’t what the Panda update was meant to address, these duplicates can exacerbate every other thin content issue.

The Solution

Get rid of them, plain and simple. True duplicates should be canonicalized, usually with a 301-redirect or the canonical tag. Paths to duplicate URLs may need to be cut, too. Telling Google that one URL is canonical only to link to 5 versions on your own site will only prolong your problems.
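As a rough illustration of that cleanup, here’s a Python sketch that maps URL variants onto one canonical form; the tracking-parameter list is hypothetical, and in practice you’d back a mapping like this with 301 redirects or canonical tags:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical tracking parameters that spawn duplicate URLs.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid"}

def canonicalize(url):
    """Map URL variants (tracking params, trailing slashes) to one form."""
    parts = urlparse(url)
    query = [(k, v) for k, v in parse_qsl(parts.query)
             if k not in TRACKING_PARAMS]
    path = parts.path.rstrip("/") or "/"
    return urlunparse((parts.scheme, parts.netloc.lower(), path,
                       "", urlencode(sorted(query)), ""))

print(canonicalize("http://Example.com/widget/?utm_source=feed&color=red"))
# -> http://example.com/widget?color=red
```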

2. True Duplicates (Cross-site)

Google is becoming increasingly aggressive about cross-site duplicates, which may differ by their wrapper but are otherwise the exact same pieces of content across more than one domain:

[Image: cross-site duplicates]

Too many people assume that this is all an issue of legitimacy or legality – scrapers are bad, but syndication and authorized duplication are fine. Unfortunately, the algorithm doesn’t really care. The same content across multiple sites is SERP noise, and Google will try to filter it out.

The Solution

Here’s where things start to get tougher. If you own all of the properties or control the syndication, then a cross-domain canonical tag is a good bet. Choose which version is the source, or Google may choose for you. If you’re being scraped and the scrapers are outranking you, you may have to build your authority or file a DMCA takedown. If you’re a scraper and Panda knocked you off the SERPs, then go Panda.

3. Near Duplicates (Internal)

Within your own site, “near” duplicates are just that – pages which vary by only a small amount of content, such as a couple of lines of text:

[Image: internal near duplicates]

A common example is when you take a page of content and spin it off across 100s of cities or topics, changing up the header and a few strategic keywords. In the old days, the worst that could happen was that these pages would be ignored. Post-Panda, you risk much more severe consequences, especially if those pages make up a large percentage of your overall content.

Another common scenario is deep product pages that only vary by a small piece of information, such as the color of the product or the size. Take a T-shirt site, for example – any given style could come in dozens of combinations of gender, color, and size. These pages are completely legitimate from a user perspective, but once they multiply into the 1000s, they may look like low-value content to Google.

The Solution

Unfortunately, this is a case where you might have to bite the bullet and block these pages (such as with META NOINDEX). For the second scenario, I think that can be a decent bet. You might be better off focusing your ranking power on one product page for the T-shirt instead of every single variation. In the geo-keyword example, it’s a bit tougher, since you built those pages specifically to rank. If you’re facing large-scale filtering or devaluation, though, blocking those pages is better than the alternative. You may want to focus on just the most valuable pages and prune those near duplicates down to a few dozen instead of a few thousand. Alternatively, you’ve got to find a way to add content value, beyond just a few swapped-out keywords.
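If you want to gauge how near-duplicate your own pages are, one rough approach is shingle-based similarity. Here’s a minimal Python sketch (the example pages, and any threshold you’d apply, are purely illustrative) comparing two pages with Jaccard similarity over word shingles:

```python
def shingles(text, size=5):
    """Break text into overlapping word n-grams ("shingles")."""
    words = text.lower().split()
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def jaccard_similarity(text_a, text_b):
    """Share of shingles two documents have in common (0.0 to 1.0)."""
    a, b = shingles(text_a), shingles(text_b)
    return len(a & b) / len(a | b) if a | b else 0.0

# Two pages that differ only by a couple of swapped-out keywords.
page1 = "The best plumber in Chicago offers fast reliable affordable service " * 5
page2 = "The best plumber in Boston offers fast reliable affordable service " * 5

score = jaccard_similarity(page1, page2)
print(f"Similarity: {score:.2f}")  # near 1.0 flags a likely near duplicate
```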

4. Near Duplicates (Cross-site)

You can also have near duplicates across sites. A common example is a partnered reseller who taps into their customers’ databases to pull product descriptions. Add multiple partners, plus the original manufacturer’s site, and you end up with something like this:

[Image: cross-site near duplicates]

While the sites differ in their wrappers and some of their secondary content, they all share the same core product description (in red). Unfortunately, it’s also probably the most important part of the page, and the manufacturer will naturally have a ranking advantage.

The Solution

There’s only one viable long-term solution here – if you want to rank, you’ve got to build out unique content to support the borrowed content. It doesn’t always take a lot, and there are creative ways to generate content cost-effectively (like user-generated content). Consider the product page below:

[Image: unique content illustration]

The red text is the same, but here I’ve supplemented it with 2 unique bits of copy: (1) a brief editorial description, and (2) user reviews. Even a 1-2 sentence lead-off editorial that’s unique to your site can make a difference, and UGC is free (although it does take time to build).

Of course, the typical argument is “I don’t have the time or money to create that much unique content.” This isn’t something you have to do all at once – pick the top 5-10% of your best sellers and start there. Give your best products some unique content and see what happens.

5. Low Unique Ratio

This scenario is similar to internal near-duplicates (#3), but I’m separating it out because I find it manifests in a different way on a different set of sites. Instead of repeating body content, sites with a low ratio of unique content end up with too much structure and too little copy:

[Image: low unique content]

This could be a result of excessive navigation, mega-footers, repeated images or dynamic content – essentially, anything that’s being used on every page that isn’t body copy.

The Solution

Like internal near-duplicates, you’ve got to buckle down and either beef up your unique content or consider culling some of these pages. If your pages are 95% structure with 1-2 sentences of unique information, you really have to ask yourself what value they provide.
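One rough way to spot this pattern at scale is to strip out the text that appears on every page and measure what’s left. A minimal Python sketch, assuming you’ve already extracted each page’s visible text:

```python
def sitewide_boilerplate(pages):
    """Lines that appear on every page of the site (nav, mega-footers)."""
    line_sets = [{l.strip() for l in p.splitlines() if l.strip()}
                 for p in pages]
    return set.intersection(*line_sets) if line_sets else set()

def unique_ratio(page_text, boilerplate_lines):
    """Fraction of a page's lines that aren't shared, sitewide boilerplate."""
    lines = [l.strip() for l in page_text.splitlines() if l.strip()]
    unique = [l for l in lines if l not in boilerplate_lines]
    return len(unique) / len(lines) if lines else 0.0

pages = ["Home | About | Contact\nMega footer\nOnly one real sentence here.",
         "Home | About | Contact\nMega footer\nA different lone sentence."]
boiler = sitewide_boilerplate(pages)
for p in pages:
    print(f"unique ratio: {unique_ratio(p, boiler):.0%}")  # 33% each
```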

6. High Ad Ratio

You’ve all seen this site, jam-packed with banner ads of all sizes and AdSense up and down both sides (and probably at the top and bottom):

[Image: too many ads]

Of course, not coincidentally, you’ve also got a low amount of unique content in play, but Google can take an especially dim view of loading up on ads with nothing to back it up.

So, how much is too much? Last year, an affiliate marketer posted a very interesting conversation with an AdWords rep. Although this doesn’t technically reveal anything about the organic algorithm, it does tell us something about Google’s capabilities and standards. The rep claims that Google views a quality page as having at least 30% unique content, and it can only have as much space devoted to ads as it does to unique content. More importantly, it strongly suggests that Google can algorithmically measure both content ratio (#5) and ad ratio.

The Solution

You’ve got to scale back, or you’ve got to build up your content. Testing is very important here. Odds are good that, if your site is jammed with ads, some of those ads aren’t getting much attention. Collect the data, find out which ones, and cut them out. You might very well find that you not only improve your SEO, but you also improve the CTR on your remaining ads.

7. Search within Search

Most large (and even medium-sized) sites, especially e-commerce sites, have pages and pages of internal search results, many reachable by links (categories, alphabetical, tags, etc.):

[Image: search within search]

Google has often taken a dim view of internal search results (sometimes called “search within search”, although that term has also been applied to Google’s direct internal search boxes). Essentially, they don’t want people to jump from their search results to yours – they want search users to reach specific, actionable information.

While Google certainly has their own self-interest in mind in some of these cases, it’s true that internal search can create tons of near duplicates, once you tie in filters, sorts, and pagination. It’s also arguable that these pages create a poor search experience for Google users.

The Solution

This can be a tricky situation. On the one hand, if you have clear conceptual duplicates, like search sorts, you should consider blocking or NOINDEXing them. Having the ascending and descending version of a search page in the Google index is almost always low value. Likewise, filters and tags can often create low-value paths to near duplicates.

Search pagination is a difficult issue and beyond the scope of this post, although I’m often in favor of NOINDEXing pages 2+ of search results. They tend to convert poorly and often look like duplicates.
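As a sketch of what that looks like in practice, here’s how you might decide which internal-search URLs get a noindex directive; the parameter names and rules are hypothetical, and you’d emit the result as a robots meta tag or an X-Robots-Tag header:

```python
from urllib.parse import urlparse, parse_qs

def robots_directive(url):
    """Return a noindex directive for low-value internal-search URLs."""
    params = parse_qs(urlparse(url).query)
    page = int(params.get("page", ["1"])[0])
    # Hypothetical rules: noindex sorted/filtered views and pages 2+.
    if "sort" in params or "filter" in params or page > 1:
        return "noindex, follow"
    return "index, follow"

print(robots_directive("http://example.com/search?q=shirts&sort=desc"))
print(robots_directive("http://example.com/search?q=shirts&page=3"))
print(robots_directive("http://example.com/search?q=shirts"))
```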

A Few Words of Caution

Any change that would massively reduce your search index is something that has to be considered and implemented carefully. While I believe that thin content is an SEO disadvantage and that Google will continue to frown on it, I should also note that not all of these scenarios are necessarily reflected in the Panda update. These issues do reflect longer-standing Google biases and may exacerbate Panda-related problems.

Unfortunately, we’ve seen very few success stories of Panda recovery at this stage, but I strongly believe that addressing thin content, increasing uniqueness, and removing your lowest value pages from the index can have a very positive impact on SEO. I’d also bet good money that, while the Panda algorithm changes may be adjusted and fine-tuned, Google’s attitude toward thin content is here to stay. Better to address content problems now than find yourself caught up in the next major update.

Sad panda image licensed from iStockPhoto (©2010).

