Cloaking is good for you. Just ignore Bing’s/Google’s guidelines.

Summary first: If you feel the need to cloak, do it within reason. Don’t cloak because you can, but because it’s technically the most elegant way to accomplish a Web development task. Bing and Google can’t detect your (in no way deceptive) intent algorithmically. Don’t spam away, though, because if you aren’t good enough at spamming search engines you’ll leave trails beyond the cloaking itself. Keep your users’ interests in mind. Don’t treat search engine guidelines as set in stone, but do comply with them to a reasonable degree, for example when they force you onto Web standards that make more sense than the fancy internationalization scheme you’ve cooked up based on detecting browser language settings or the like.

[Image: search engine guidelines are bullshit WRT cloaking]

This pamphlet is an opinion piece. What I said above should be considered best practice, even by search engines. Of course it isn’t, because search engines can and do fail, just like a webmaster who takes my statement “go cloak away if it makes sense” as technical advice and gets his search engine visibility tanked the hard way.

WTF is cloaking?

Cloaking, also known as IP delivery, means delivering content tailored for specific users who are identified primarily by their IP addresses, but also by user agent (browser, crawler, screen reader…) names, and whatnot. Here’s a simple demonstration of this technique. The content of the next paragraph differs depending on the user requesting this page. Googlebot, Googlers, as well as Matt Cutts at work, will read a personalized message:

Dear visitor, thanks for your visit from 54.161.145.251 (ec2-54-161-145-251.compute-1.amazonaws.com).

You surely can imagine that cloaking opens a can of worms, er, lots of opportunities to enhance a user’s surfing experience, besides “stalking” particular users like Google’s head of WebSpam.
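
For the code-minded, here’s roughly how such a personalized paragraph could be produced. This is a minimal Python sketch, not a battle-tested implementation: the Googlebot network and the greeting texts are placeholders I made up for illustration.

```python
import ipaddress
import socket

# Example Googlebot network, for illustration only; a real setup would rely on
# reverse DNS verification or a maintained list of crawler networks.
GOOGLEBOT_NETS = [ipaddress.ip_network("66.249.64.0/19")]

def greeting(remote_addr: str, user_agent: str) -> str:
    ip = ipaddress.ip_address(remote_addr)
    try:
        host = socket.gethostbyaddr(remote_addr)[0]   # reverse DNS for the greeting
    except (socket.herror, socket.gaierror):
        host = remote_addr

    # Crawlers (and anybody requesting from a Google network) get the personalized message ...
    if "Googlebot" in user_agent or any(ip in net for net in GOOGLEBOT_NETS):
        return "Dear Googler, nice to see you dropping by from %s (%s)." % (remote_addr, host)

    # ... everybody else gets the generic paragraph shown above.
    return "Dear visitor, thanks for your visit from %s (%s)." % (remote_addr, host)
```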

Why do search engines dislike cloaking?

Apparently they don’t. They use IP delivery themselves. When you’re traveling in Europe, you’ll get hints like “go to Google.fr” or “go to Google.at” all the time. That’s google.com checking where you are, trying to lure you into their regional services.

More seriously, there’s a so-called “dark side of cloaking”. Say you’re a seasoned Internet marketer: you could show Googlebot an educational page with compelling content under a URI like “/games/poker”, served with an X-Robots-Tag HTTP header saying “noarchive”, whilst surfers (search engine users) who supply an HTTP_REFERER and don’t come from employee.google.com get redirected to poker dot com (simplified example).
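
To make that concrete, here’s a stripped-down sketch of the pattern just described, purely for illustration, not a recommendation. The framework (Flask), the route, the helper names and the naive crawler check are my assumptions, not anything Google or Bing publishes.

```python
from flask import Flask, make_response, redirect, request

app = Flask(__name__)

def render_educational_page() -> str:
    return "<h1>A short history of card games</h1><p>Compelling educational copy…</p>"

def looks_like_crawler(req) -> bool:
    # Naive stand-in; see the crawler-verification sketch further down.
    return "bot" in req.headers.get("User-Agent", "").lower()

@app.route("/games/poker")
def poker():
    if looks_like_crawler(request):
        # Spider fodder for the engines, kept out of the cache via noarchive.
        resp = make_response(render_educational_page())
        resp.headers["X-Robots-Tag"] = "noarchive"
        return resp
    referrer = request.headers.get("Referer", "")
    if referrer and "employee.google.com" not in referrer:
        # Searchers arriving with a referrer get bounced to the money site.
        return redirect("https://poker.example/", code=302)
    # Direct visits (no referrer) and Google employees see the harmless page too.
    return render_educational_page()
```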

That’s hard for Google’s WebSpam team to detect. Because they don’t do evil themselves, they can’t officially operate sneaky bots that use, say, AOL as their ISP to compare your spider fodder to the pages/redirects served to actual users.

Bing sends out spam bots that request your pages “as a surfer” in order to discover deceptive cloaking. Of course those bots can be identified, so professional spammers serve them their spider fodder. Besides burning the bandwidth of non-cloaking sites, Bing doesn’t accomplish anything useful in terms of search quality.
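
Telling genuine crawlers apart from everything else is the whole game here. Both Google and Bing document a reverse DNS lookup plus a forward confirmation for verifying their official crawlers; a request that claims to be a surfer but resolves into a search engine’s network is suspect by the same token. A minimal sketch of that verification, with error handling kept short:

```python
import socket

# Domains the official crawlers resolve to (googlebot.com/google.com for Googlebot,
# search.msn.com for Bingbot/msnbot).
OFFICIAL_CRAWLER_DOMAINS = (".googlebot.com", ".google.com", ".search.msn.com")

def is_genuine_crawler(remote_addr: str) -> bool:
    try:
        host = socket.gethostbyaddr(remote_addr)[0]        # reverse lookup
    except (socket.herror, socket.gaierror):
        return False
    if not host.endswith(OFFICIAL_CRAWLER_DOMAINS):
        return False
    try:
        _, _, forward_ips = socket.gethostbyname_ex(host)  # forward confirmation
    except socket.gaierror:
        return False
    return remote_addr in forward_ips                      # must resolve back to the caller
```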

Because search engines can’t detect cloaking properly, not to speak of a cloaking webmaster’s intentions, they’ve launched webmaster guidelines (FUD) that forbid cloaking altogether. All Google/Bing reps tell you that cloaking is an evil black-hat tactic that will get your site penalized or even banned. By the way, the same goes for perfectly legit “hidden content” that’s invisible on page load, but viewable after a mouse click on a “learn more” widget/link or the like.

Bullshit.

If your competitor makes creative use of IP delivery to enhance their visitors’ surfing experience, you can file a spam report for cloaking and Google/Bing will ban the site eventually. Just because cloaking can be used with deceptive intent. And yes, it works this way. See below.

Actually, those spam reports trigger a review by a human, so maybe your competitor gets away with it. But search engines also use spam reports to develop spam filters that penalize crawled pages fully automatically. Such filters can fail, and (trust me) they fail often. Once you need to optimize your content delivery for particular users or user groups yourself, such a filter could tank your very own stuff by accident. So don’t snitch on your competitors, because tomorrow they’ll return the favor.

Enforcing a “do not cloak” policy is evil

At least Google’s WebSpam team has cojones. They’ve even banned their very own help pages for “cloaking”, although those didn’t serve porn to minors searching for SpongeBob images with safe-search=on.

That’s overkill, because the help files of any Google product aren’t usable without a search facility. When I click “help” in a Google service like AdWords, I either get blank pages, or links within the help system are broken because the destination pages were deindexed for cloaking. Plain evil, and counterproductive.

Just because Google’s help software doesn’t show ads and related links to Googlebot, those pages aren’t guilty of deceptive cloaking. Ms Googlebot won’t pull the plastic, so it makes no sense to serve her advertisements. Related links are context-sensitive just like ads, so it makes no sense to persist them in Google’s crawling cache, or even in Google’s search index. Also, as a user I really don’t care whether Google has crawled the same heading I see on a help page or not, as long as I get directed to relevant content, that is, a paragraph or more that answers my question.
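
What those help pages presumably do is the harmless flavor of cloaking: identical content for everybody, ad and related-links widgets for humans only. A hedged sketch of that pattern; the template, the route and the crude user-agent check are assumptions on my part, not how Google’s help software actually works.

```python
from flask import Flask, render_template_string, request

app = Flask(__name__)

HELP_TEMPLATE = """
<h1>{{ title }}</h1>
<div class="answer">{{ answer }}</div>
{% if show_widgets %}
  <div class="ads"><!-- ad code would be injected here --></div>
  <div class="related"><!-- context-sensitive related links --></div>
{% endif %}
"""

def is_crawler(req) -> bool:
    ua = req.headers.get("User-Agent", "").lower()
    return any(bot in ua for bot in ("googlebot", "bingbot", "msnbot"))

@app.route("/help/some-question")
def help_page():
    return render_template_string(
        HELP_TEMPLATE,
        title="Some help topic",
        answer="The paragraph that actually answers the question.",
        # Identical content for everybody; Ms Googlebot just gets it without the ads.
        show_widgets=not is_crawler(request),
    )
```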

When a search engine intentionally doesn’t deliver the very best search results, just because those pages violate an outdated and utterly useless policy, one that targets fraudulent tactics in a form last seen in the last century and doesn’t take into account how the Internet works today, I’m pissed.

Maybe that’s not so bad when it only hits Google products? Bullshit, again. The same happens to any other website that doesn’t fit Google’s weird idea of “serving the same content to users and crawlers”. As long as Google’s crawlers come from US IPs only, how is a US-based webmaster supposed to serve the same German-language content to a user coming from Austria and to Googlebot, when both request a URI like “/shipping-costs?lang=de” that has to differ per user because shipping a parcel to Germany costs $30.00 and a parcel of the same weight shipped to Vienna costs $40.00? Don’t tell me that bothering a user with shipping fees for every region in CH/AT/DE on one page is a good idea, when I can reduce the information overload to the one shipping fee my user expects to see, followed by a link to a page that lists shipping costs for all European countries, or all countries where at least some folks might speak/understand German.
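
Here’s a sketch of that shipping-costs scenario: one tailored fee for visitors the geo lookup can place, the complete list for everyone else, including a Googlebot crawling from a US IP. The country_from_ip() helper, the URIs and the CH figure are made up for illustration; the DE/AT fees are the ones from the example above.

```python
# Fees from the example above; the CH figure is a made-up placeholder.
SHIPPING_FEES_DE = {"DE": "$30.00", "AT": "$40.00", "CH": "$45.00"}

def country_from_ip(remote_addr: str) -> str:
    # Hypothetical helper: in practice a GeoIP database lookup goes here.
    return "US"

def shipping_info_de(remote_addr: str) -> str:
    country = country_from_ip(remote_addr)
    fee = SHIPPING_FEES_DE.get(country)
    if fee:
        # The one fee this visitor cares about, plus a link to the full list.
        return ('Versand nach %s: %s. '
                '<a href="/shipping-costs?lang=de&all=1">Versandkosten für alle Länder</a>'
                % (country, fee))
    # Googlebot crawling from a US IP, and everyone outside the zone, gets the full list.
    return '<a href="/shipping-costs?lang=de&all=1">Versandkosten für alle Länder</a>'
```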

Back to Google’s ban of its very own help pages that hid AdSense code from Googlebot. Of course Google wants to see what surfers see in order to deliver relevant search results, and that might include advertisements. However, surrounding ads don’t necessarily obfuscate a page’s content. Ads served instead of content do. So if Google wants to detect ad-laden thin pages, they need to become smarter. Penalizing pages that don’t show ads to search engine crawlers is a bad idea for a search engine, because not showing ads to crawlers is a good idea, and not only bandwidth-wise, for a webmaster.

Managing this dichotomy is the search engine’s job. They shouldn’t expect webmasters to help them solve their very own problems (maintaining search quality). In fact, bothering webmasters with policies put in place solely because search engine algos are fallible and incapable is plain evil. The same applies to instruments like rel-nofollow (launched to help Google devalue spammy links, but backfiring enormously) or Google’s war on paid links (as if each and every link on the whole Internet isn’t paid/bartered for, somehow).

What do you think, should search engines ditch their way too restrictive “don’t cloak” policies? Click to vote: Stop search engines that tyrannize webmasters!

 

Update 2010-07-06: Don’t miss out on Danny Sullivan’s “Google be fair!” appeal, posted today: Why Google Should Ban Its Own Help Pages — But Also Shouldn’t




13 Comments to "Cloaking is good for you. Just ignore Bing's/Google's guidelines."

  1. Ms Googlebot on 6 July, 2010  #link

    Thanks for laying out milk and cookies.

  2. […] Cloaking is good for you. Just ignore Bing’s/Google’s guidelines., sebastians-pamphlets.com […]

  3. Matt Inertia on 6 July, 2010  #link

    Seems to me that cloaking or “IP delivery” is a necessary process to employ (when you completely disregard what Google says you should do). Why the hell wouldn’t I want to deliver different content to different people? They are, after all, different people! Fair enough, delivering Googlebot a page with 3000 words of keyword-orientated nonsensical text when your normal page contains a paragraph of sales text and 6 product images is a blatant spam technique, but as you suggest above there are practical applications of IP delivery.

    So what’s the difference between IP delivery and geo targeting?

    Looks like IP delivery/cloaking is about to be the next big topic in the SEO bar.

  4. Mike on 6 July, 2010  #link

    I can see where cloaking can be used for bad and negative reasons, but there must be some way of differentiating this from the good type, after all like you said Google use it themselves.

  5. Johan on 6 July, 2010  #link

    I think you’re cutting this a bit short.

    Of course you are right about how google should allow some cloaking techniques, like in the examples you mentioned.

    But serving a completely different page to a search engine than to a regular user agent, for the sole reason of ranking better, is just plain wrong, and search engines should delete websites that do such things.

    Besides, changing a bit of content does not trigger the ‘big red button’. It has to be completely different. Poor programmers might trigger this, but changing a few prices or a few sentences wouldn’t, at least that is my experience.

  6. mohsin on 6 July, 2010  #link

    Most webmasters understand that Search Engine policies are meant to limit the choices of content serving, in the name of spam control. And most of the web-spam is in fact due to these Search Engines themselves. I think cloaking would be a legitimate technique if there were no search engines ;) Because people use cloaking to dodge Search Engines!
    Similarly, it is due to Search Engines’ crappy policies that people no longer link to others freely.

  7. Sebastian on 7 July, 2010  #link

    Matt, IP delivery (cloaking) is a method, geo targeting is a purpose.

    Mike, Google says “IP delivery is fine with us, just don’t cloak”. The problem is that they rule on methods and tactics because they fail at judging intentions, or purposes, properly. In other words, since distinguishing spammy purposes from well-meant uses of cloaking is deemed algorithmically impossible, they enforce hard rules that tyrannize webmasters and circumcise creativity.

    Johan, if a search engine tried to delete a website of mine, I’d call 911. That’s plain theft. ;-)
    Mohsin, you’re spot on.

  8. Jim Watson on 7 July, 2010  #link

    Excellent article Sebastian - bang on the money.

  9. […] Cloaking is good for you. Just ignore Bing’s/Google’s guidelines. - lol… and what is a week on the Trails without a good rant? Weak I say… thus it was good to see Sebastian taking a great little kick at the can towards the world of legitimate and illegitimate cloaking. This is an area I wish search engines would talk about more for clarity. Nice on Seb!! […]

  10. SEO Mofo on 14 July, 2010  #link

    What bothers me the most is the “circumcised creativity” effect. Google is bottle-necking the evolution of the Web. How many good ideas have fallen by the wayside because of the webmaster’s fear of “how will Google handle this?”

    For me personally, the most difficult problem in SEO has been finding the outer limit of Google engineers’ abilities/imaginations and making sure my “brilliant ideas” don’t cross that line. I eventually realized that advanced/creative/innovative web development techniques are just as much of a threat to rankings as blatant spam techniques…if not more so.

  11. nyhl on 14 July, 2010  #link

    I think anything that’s too much can harm your site, whether too much spam or too much playing it safe. If you’re spamming, the search engines will be there to take you down, and if you’re playing it safe and rank well, your competitors’ sites will sabotage and report your site. It all depends on your knowledge of and limits around this cloaking process; then maybe you’ll be lucky enough not to get flagged as spam.

  12. Riparazioni Torino on 24 July, 2010  #link

    Using your technique, BMW was banned from Google: http://blogoscoped.com/archive/2006-02-04-n60.html
    This sucks. I wanted to do something similar on a client’s website and got really discouraged because of this. I am not sure if it’s ethical from Google, but I don’t think we have any choice but to live with it.

    [”Whitehat cloaking” sure as hell is “ethical from Google” ;-) As a matter of fact, you can cloak lots of other things besides hidden text and links on sneakily redirecting doorway pages …]

  13. […] Cloaking is Good for You. Just Ignore Bing’s/Google’s Guidelines. (Sebastian’s Pamphlets): Sebastian X makes a good argument about the benefits of cloaking for user experience. […]
