Archived posts from the 'CTR' Category

Analyzing search engine rankings by human traffic

Recently I’ve discussed ranking checkers at several places, and I’m quite astonished that folks still see some value in ranking reports. Frankly, ranking reports are –in most cases– a useless waste of paper and/or disk space. That does not mean that SERP positions per keyword phrase aren’t interesting. They’re just useless without context, that is, traffic data. Converting traffic pays the bills, not rankings alone. The truth is in your traffic data.

That said, I’d like to outline a method to get a particularly useful piece of information out of raw traffic data: underestimated search terms. That’s not a new idea, and perhaps you have the reports already, but maybe you don’t look at the information that is somewhat hidden in stats ordered by success, not failure. And you should be –or rather employ– a programmer to implement it.

The first step is gathering data. Create a database table to record all hits, then, in a footer include or the like, once the complete page has been output, write all the data you have into that table. All data means URL, timestamp, and variables like referrer, user agent, IP, language and so on. Be a data rat, log everything you can get hold of. With dynamic sites it’s easy to add the page title, (product) IDs etcetera; with static sites write a tool to capture these attributes separately.
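If your pages are served by a scripting language, the footer include can do the write itself. Here’s a minimal sketch of such a hit logger in Python with SQLite, just to illustrate the idea; the table and column names are made up, and any RDBMS will do:

```python
import sqlite3
import time

def log_hit(db_path, url, referrer, user_agent, ip, language, page_title=None):
    """Append one page view to the raw log table (a sketch, not a drop-in)."""
    conn = sqlite3.connect(db_path)
    conn.execute("""CREATE TABLE IF NOT EXISTS raw_hits (
                        id INTEGER PRIMARY KEY AUTOINCREMENT,
                        ts INTEGER, url TEXT, referrer TEXT, user_agent TEXT,
                        ip TEXT, language TEXT, page_title TEXT)""")
    conn.execute("""INSERT INTO raw_hits
                    (ts, url, referrer, user_agent, ip, language, page_title)
                    VALUES (?, ?, ?, ?, ?, ?, ?)""",
                 (int(time.time()), url, referrer, user_agent,
                  ip, language, page_title))
    conn.commit()
    conn.close()
```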

For performance reasons it makes sense to work with a raw data table, which has just a primary key, to log the requests, and normalized working tables with lots of indexes to allow aggregations, ad hoc queries, and fast reports from different perspectives. Also think of regularly purging the raw log table, and of historization. While transferring raw log data to the working tables in low traffic hours, or on another machine, you can calculate interesting attributes and add data from other sources which were not available to the logging process.
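A sketch of such a transfer job, assuming the raw_hits table from above and SQLite; the working table, its indexes, and the enrichment step are placeholders you’d adapt to your own schema:

```python
import sqlite3

def transfer_raw_hits(db_path):
    """Move rows from the index-free raw log into an indexed working table,
    then purge the raw log. Meant to run in low traffic hours, e.g. via cron."""
    conn = sqlite3.connect(db_path)
    conn.executescript("""
        CREATE TABLE IF NOT EXISTS hits (
            id INTEGER PRIMARY KEY, ts INTEGER, url TEXT, referrer TEXT,
            search_engine TEXT, search_term TEXT, serp_number REAL);
        CREATE INDEX IF NOT EXISTS idx_hits_term ON hits (search_engine, search_term);
        CREATE INDEX IF NOT EXISTS idx_hits_url  ON hits (url);""")
    rows = conn.execute("SELECT id, ts, url, referrer FROM raw_hits").fetchall()
    for hit_id, ts, url, referrer in rows:
        # Calculate interesting attributes here, e.g. search engine, search term
        # and SERP number from the referrer (see the parser sketch further down).
        conn.execute("INSERT OR IGNORE INTO hits (id, ts, url, referrer) "
                     "VALUES (?, ?, ?, ?)", (hit_id, ts, url, referrer))
    conn.execute("DELETE FROM raw_hits")  # purge the raw log after the transfer
    conn.commit()
    conn.close()
```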

You’ll need that traffic data collector anyway for a gazillion purposes where your analytics software fails, is not precise enough, or just can’t deliver a particular evaluation perspective. It’s a prerequisite for the method discussed here, but don’t build a monster-sized cannon to chase a fly. You can gather search engine referrer data from logfiles too.

For example, one interesting piece of information is the SERP on which a user clicked a link pointing to your site. Simplified, you need three attributes in your working tables to store this info: search engine, search term, and SERP number. You can extract these values from the HTTP_REFERER. Here is what a few typical Google referrers tell you.

1. “google” in the server name tells you the search engine.
2. The “q” variable’s value tells you the search term “keyword1 keyword2”.
3. The lack of a “start” variable tells you that the result was placed on the first SERP. The lack of a “num” variable lets you assume that the user got 10 results per SERP, so it’s quite safe to say that you rank in the top 10 for this term. Actually, the number of results per page is not always extractable from the URL because it’s usually pulled from a cookie, but not many surfers change their preferences (e.g. less than 0.5% surf with 100 results, according to JohnMu and my data as well). If you’ve got a “num” value, add 1 and divide the result by 10 to make the data comparable. If that’s not precise enough you’ll spot it afterwards, and you can always recalculate SERP numbers from the canned referrers.

1. and 2. as above.
3. The “start” variable’s value 10 tells you that you got a hit from the second SERP. When start=10 and there is no “num” variable, most probably the searcher got 10 results per page.

1. and 2. as above.
3. The empty “startIndex” variable and startPage=1 are useless, but the lack of “start” and “num” tells you that you’ve got a hit from the 1st Spanish SERP.

1. and 2. as above.
3. num=20 tells you that the searcher views 20 results per page, and start=20 indicates the second SERP, so you rank between #21 and #40, thus the (averaged) SERP# is 3.5 (provided SERP# is not an integer in your database).

You get the idea. Here is a cheat sheet and the official documentation on Google’s URL parameters. Analyze the URLs in your referrer logs and call them with cookies off, which disables your personal search preferences, then play with the values. Do that with other search engines too.
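Here’s a minimal sketch of such a referrer parser, limited to Google and to the “q”, “start” and “num” parameters discussed above; the averaged SERP number is normalized to 10 results per page, matching the examples:

```python
from urllib.parse import urlparse, parse_qs

def parse_serp_referrer(referrer):
    """Extract search engine, search term and (averaged) SERP number
    from an HTTP_REFERER. Google-only sketch; extend for other engines."""
    url = urlparse(referrer)
    if "google" not in url.netloc:
        return None                                  # not a Google SERP referrer
    params = parse_qs(url.query)
    term = params.get("q", [""])[0]
    if not term:
        return None
    start = int(params.get("start", ["0"])[0] or 0)  # 0 = first result page
    num = int(params.get("num", ["10"])[0] or 10)    # assume 10 results per page
    # The hit came from ranks start+1 .. start+num, i.e. from the 10-results
    # SERPs (start/10)+1 .. (start+num)/10; average that range.
    first_serp = start // 10 + 1
    last_serp = max(first_serp, (start + num) // 10)
    serp_number = (first_serp + last_serp) / 2.0     # start=20&num=20 -> 3.5
    return {"engine": "google", "term": term, "serp": serp_number}

# A hit from the second SERP with the default 10 results per page:
print(parse_serp_referrer("http://www.google.com/search?q=keyword1+keyword2&start=10"))
# {'engine': 'google', 'term': 'keyword1 keyword2', 'serp': 2.0}
```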

Now a subset of your traffic data has a value in “search engine”. Aggregate tuples where search engine is not NULL, then select, for example, the results where the SERP number is lower than or equal to 3.99 (respectively 4), ordered by SERP number ascending, hits ascending and keyword phrase, with a break by search engine. (Why not sorted by traffic descending? You have a report of your best performing keywords already.)
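Against the working table sketched above, that report boils down to a single aggregation query. A sketch (table and column names are still my made-up ones):

```python
import sqlite3

# Search terms you rank for on the first four SERPs, the neglected ones first:
# best average position first, then lowest traffic first, then by phrase,
# with a break by search engine.
REPORT_SQL = """
    SELECT search_engine,
           search_term,
           AVG(serp_number) AS avg_serp,
           COUNT(*)         AS hit_count
    FROM   hits
    WHERE  search_engine IS NOT NULL
    GROUP  BY search_engine, search_term
    HAVING AVG(serp_number) <= 3.99
    ORDER  BY search_engine, avg_serp ASC, hit_count ASC, search_term"""

def underestimated_terms(db_path):
    conn = sqlite3.connect(db_path)
    rows = conn.execute(REPORT_SQL).fetchall()
    conn.close()
    return rows
```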

The result is a list of search terms you rank for on the first 4 SERPs, beginning with keywords you’ve probably not optimized for. At the very least you didn’t optimize the snippet to improve CTR, so your ranking doesn’t generate a reasonable amount of traffic. Before you study the report, throw away your site owner hat and try to think like a consumer. Sometimes consumers use a vocabulary you didn’t think of before.

Research promising keywords, and decide whether you want to push, bury or ignore them. Why bury? Well, in some cases you just don’t want to rank for a particular search term, [your product sucks] being just one example. If the ranking is fine, the search term smells somewhat lucrative, and just the snippet sucks in a particular search query’s context, enhance your SERP listing.

Every once in a while you’ll discover a search term making a killing for your competitors whilst you never spotted it because your stats package reports only the best 500 monthly referrers or so. Also, you’ll get the most out of your rankings by optimizing their SERP CTRs.

Be creative; over time your traffic database becomes more and more valuable, allowing other unconventional and/or site-specific reports which off-the-shelf analytics software usually doesn’t deliver. Most probably your competitors use standard analytics software, so individually developed algos and reports can make a difference. That doesn’t mean you should throw away your analytics software to reinvent the wheel. However, once you’re used to self-developed analytic tools, you’ll think of more interesting methods (not only for analyzing and monitoring rankings by human traffic) than you can implement in this century ;)

Bear in mind that the method outlined above does not and cannot replace serious keyword research.

Another –very popular– approach to get this info would be automated ranking checks mashed up with hits by keyword phrase. Unfortunately, Google and other engines do not permit automated queries for the purpose of ranking checks, and this method works with preselected keywords only, which means you won’t find (all) search terms created by users. Even when you compile your ranking checker’s keyword lists via various keyword research tools, your seed list will still miss some interesting keywords.

Related thoughts: Why regular and automated ranking checks are necessary when you operate seasonal sites by Donna


Google assists SERP Click-Through Optimization

Big Mama Google, in her ongoing campaign to keep her search index clean, assists Webmasters with reports allowing click-through optimization of a dozen or so pages per Web site. Google launched these reports a while ago, but most Webmasters didn’t make the best use of them. Now that Vanessa has revealed her SEO secrets, let’s discuss why and how Google helps you increase, improve, and target search engine traffic.

Google is not interested in gazillions of pages which rank high for (obscure) search terms but don’t get clicked from the SERPs. This clutter tortures the crawler and indexer, and it wastes expensive resources the query engine could use to deliver better results to the searchers.

Unfortunately, legions of clueless SEOs work hard to grow mount clutter by providing their clients with weekly ranking reports, which leads to even more pages that rank for (potentially money making) search phrases but appear on the SERPs with such crappy titles and snippets that not even a searcher with an IQ slightly below a slice of bread clicks them.

High rankings don’t pay the bills; converting traffic from SERPs, on the other hand, does. A nicely ranking page is an asset which in most cases needs just a few minor tweaks to attract search engine users (mount clutter contains machine-generated cookie-cutter pages too, but that’s a completely different story).

For example, unattended pages gaining their SERP position from the anchor text of links pointing to them often have a crappy click-through rate (CTR). Say you’ve got a page about a particular aspect of green widgets which applies to widgets of all colors. For some reason folks preferring red widgets like your piece and link to it with “red widgets” as anchor text. The page will rank fine for [red widgets], but since “red widgets” is not mentioned on the page, this keyword phrase doesn’t appear in the SERP snippet, not to speak of the linked title. Search engine users seeking information on red widgets don’t click the link about green widgets, although it might be the best matching search result.

So here is the click-through optimization process based on Google’s query stats (it doesn’t work with brand-new or more or less unindexed sites, because the data provided in Google’s Webmaster Tools is available, reliable and quite accurate for somewhat established sites only):

Log in, choose a site and go to query stats. In an ideal world you’ll see two tables with nearly identical keyword lists (all examples below are made up).

Top search queries (avg. top position)        Top search query clicks (avg. top position)
1. web site design (5)                        1. web site design (4)
2. google consulting (4)                      2. seo consulting (5)
3. seo consulting (3)                         3. google consulting (2)
4. web site structures (2)                    4. internal links (3)
5. internal linkage (1)                       5. web site structure (3)
6. crawlability (3)                           6. crawlability (5)
The “Top search queries” table on the left shows positions for search phrases on the SERPs, regardless of whether these pages got clicks or not. The “Top search query clicks” table on the right shows which search terms got clicked most, and where the landing pages were positioned on their SERPs. If good keywords appear in the left table but not in the right one, you’ve got CTR optimization potential.

The “average top position” might differ from today’s SERPs, and it might differ for particular keywords even if those appear in the same line in both tables. Positioning fluctuation depends on a couple of factors. First, the position is recorded at the run time of each search query during the last 7 days, and within seven days a page can jump up and down on the SERPs. Second, positioning on, for example, UK SERPs can differ from US SERPs, so an average 3rd position may be an utterly useless value when a page ranks #1 in the UK and gets a fair amount of traffic from UK SERPs, but ranks #8 on US SERPs and searchers don’t click it because the page is about a local event near Loch Nowhere in the Highlands. Hence refine the reports by selecting your target markets in “location”, and if necessary “search type” too. Third, if these stats are generated from very few searches and even fewer click-throughs, they are totally and utterly useless for optimization purposes.

Let’s say you’ve got a site with a fair amount of Google search engine traffic; the next step is identifying the landing pages involved (you get only 20 search queries, so the report covers only a fraction of your site’s pages). Pull these data from your referrer stats, or extract SERP referrers from your logs to create a crosstab of the search terms from Google’s reports per landing page. Although the click data come from Google’s SERPs, it might make sense to do this job with a broader scope, that is, including referrers from all major search engines.
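A sketch of that crosstab, assuming you kept the working table (“hits”, with url, search_engine and search_term columns) described in the previous post; the reported phrases go in as a plain list:

```python
import sqlite3
from collections import defaultdict

def landing_page_crosstab(db_path, reported_terms):
    """Hits per (landing page, search term) for the phrases Google reports."""
    placeholders = ",".join("?" for _ in reported_terms)
    query = ("SELECT url, search_term, COUNT(*) FROM hits "
             "WHERE search_engine IS NOT NULL AND search_term IN (%s) "
             "GROUP BY url, search_term" % placeholders)
    crosstab = defaultdict(dict)
    conn = sqlite3.connect(db_path)
    for url, term, hit_count in conn.execute(query, list(reported_terms)):
        crosstab[url][term] = hit_count
    conn.close()
    return crosstab
```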

Now perform the searches for your 20 keyword phrases (just click on the keywords in the report) to check how your pages look on the SERPs. If particular landing pages trigger search results for more than one search term, extract them all. Then load your landing page and view its source. Read your page first rendered in your browser, then check out semantic hints in the source code, for example ALT or TITLE text and stuff like that. Look at the anchor text of incoming links (you can use link stats and anchor text stats from Google, We Build Pages Tools, …) and other ranking factors to understand why Google thinks this page is a good match for the search term. For each page, let the information sink in before you change anything.

If the page is not exactly a traffic generator for other targeted keywords, you can optimize it with regard to a better CTR for the keyword(s) it ranks for. Basically that means: use the keyword(s) naturally in all page areas where it makes sense, and provide each occurrence with a context which hopefully makes it into the SERP snippet.

Make up a few natural sentences a searcher might have in mind when searching for your keyword(s). Write them down. Order them by their ability to fit the current page text in a natural way. Bear in mind that with personalized search Google could have scanned the searcher’s brain to add different contexts to the search query, so don’t concentrate too much on the keyword phrase alone, but on short sentences containing both the keyword(s), or their synonyms, and a sensible context as well.

There is no magic number like “use the keywords 5 times to get a #3 spot” or “7 occurrences of a keyword gain you a #1 ranking”. Optimal keyword density is a myth, so just apply common sense and don’t annoy human readers. One readable sentence containing the keyword(s) might suffice. Also, emphasizing keywords (EM/I, STRONG/B, eye-catching colors …) makes sense because it helps catch the attention of scanning visitors, but don’t over-emphasize because that looks crappy. The same goes for H2/H3/… headings. Structure your copy, but don’t write in headlines. When you emphasize a word or phrase in (bold) red, don’t do that throughout, but only in the most important sentence(s) of your page, and preferably only on the first visible screen of a longer page.

Work in your keyword+context laden sentences, but -again!- do it in a natural way. You’re writing for humans, not for algos which at this point already know what your page is all about and rank it properly. If your fine tuning gains you a better ranking that’s fine, but the goal is catching the attention of searchers reading (in most cases just skimming) your page title and a machine generated snippet on a search result page. Convince the algo to use your inserted sentence(s) in the snippet, not keyword lists from navigation elements or so.

Write a sensible summary of the page’s content, not more than 200-250 characters, and put that into the description meta tag. Do not copy the first paragraph or other text from the page. Write the summary from scratch instead, and mention the targeted keyword(s). The first paragraph on the page can exceed the length of the meta description to deliver an overview of the page’s message, and it should provide the same information, preferably in the first sentence, but don’t make it longish.

Check the TITLE tag in HEAD: if it is truncated on the SERP, shorten it so that the keyword becomes visible, perhaps move the keyword(s) to the beginning, or create a neat page title around the keyword(s). Make title changes very carefully, because the title is an important ranking factor and your changes could result in a ranking drop. Some CMSs silently change the URL when the title text changes, and you certainly don’t want to touch the URL at this point.

Make sure that the page title appears on the page too. Putting the TITLE tag’s content (or a slight variation of it) in an H1 element in BODY cannot hurt. If for some weird reason you don’t use H-elements, then at least format it prominently (bold, a different color but not red, a bigger font size …).

If the page performs nicely for a couple of money terms and just has a crappy CTR for a particular keyword it ranks for, you can add a link pointing to a (new) page optimized for that keyword, with the keyword(s) in the anchor text, preferably embedded in a readable sentence within the content (long enough to fill two lines under the linked title on the SERP), to improve the snippet. Adding a (prominent) link to a related topic should not impact rankings for other keywords too much, but the keywords submitted by searchers should appear in the snippet shortly after the next crawl. In such cases better don’t change the title, at least not now. If the page gained its ranking solely from the anchor text of inbound links, putting the search term on the page can give it a nice boost.

Make sure you get an alert when Ms. Googlebot fetches the changed pages, and check out the SERPs and Google’s click stats a few days later. After a while you’ll get a pretty good idea of how Google creates snippets, and which snippets perform best on the SERPs. Repeat until success.
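A crude way to get that alert is to scan the access log for Googlebot requests to the changed URLs, e.g. from a daily cron job that mails you the result. A sketch; the log path and format are assumptions, and matching the user agent alone can be spoofed:

```python
import re

GOOGLEBOT = re.compile(r"Googlebot", re.IGNORECASE)

def googlebot_fetches(access_log_path, changed_urls):
    """Return access log lines where Googlebot requested one of the changed pages."""
    fetches = []
    with open(access_log_path) as log:
        for line in log:
            if GOOGLEBOT.search(line) and any(url in line for url in changed_urls):
                fetches.append(line.rstrip())
    return fetches
```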

Related posts:
Google Quality Scores for Natural Search Optimization by Chris Silver Smith
Improve SERP-snippets by providing a good meta description tag by Raj Krishnan from Google’s Snippets Team
