While releasing a Googlebot spoofer that lets my clients check how their browser-optimized pages respond to search engine crawlers, I found myself wondering again why the major search engines tolerate hardcore cloaking to such a great degree. I can live with my clients’ competition flooding the engines with zillions of doorway pages and the like, so there are no emotions involved here. I just cannot understand why the engines don’t enforce compliance with their own guidelines. That’s beyond any logic, so I’m speculating:
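The core of such a spoofer is trivial: request a page while sending a crawler’s user agent string instead of a browser’s, and see what the server returns. A minimal sketch in Python (the user agent strings and function names here are my own illustration, not the actual tool):

```python
import urllib.request

# Googlebot's published user agent string (per Google's bot documentation);
# the browser string is just a generic placeholder.
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def fetch_as(url, user_agent):
    """Fetch a URL while presenting the given User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

# Usage: compare what the server hands a "crawler" vs. a "browser".
# bot_copy = fetch_as("http://example.com/", GOOGLEBOT_UA)
# human_copy = fetch_as("http://example.com/", BROWSER_UA)
```

Of course this only catches user-agent cloaking; IP-based cloaking keys off the requester’s address and won’t fall for a spoofed header.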
They don’t care. If they went after spam indexing, they would lose a few billion indexed pages. That would be a very bad PR effect, absolutely unacceptable.
They have other priorities. Focusing on local search, they guess the problem will solve itself, because it’s not very probable that a spammer resides close to the search engine user who is seeking a pizza service and lands in a PPC popup hell. Just claim it ain’t broke, so why fix it?
They believe spam detection is unethical. ‘Don’t be evil’ can be read as ‘You can cheat us using black hat methods. We won’t use your own medicine to strike back’. Hey, this interpretation makes sense! Translation for non-geeks: ‘Spoofing is as evil as cloaking, cloaking cannot be discovered without spoofing, and since we aren’t evil, we encourage you to cloak’.
Great. Tomorrow I’ll paint my white hat black and add a billion or more sneaky pages to everyone’s index.
Seriously, I have a strong gut feeling that everything said above will belong to the past pretty soon. The engines, which are already changing their crawlers’ user agent names to ‘Mozilla…’, could learn to render their spider food and to pull it from unexpected IP addresses. With all due respect to successful black hat SEOs, I believe that white hat search engine optimization is the better business decision, probably even in competitive industries over the long haul.
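Pulling the same URL twice, once as a crawler and once as a browser, and diffing the two copies is all an engine would need to flag the crudest cloakers. A naive heuristic (threshold and function names are my own assumption, not anything the engines have published):

```python
import difflib

def similarity(html_a, html_b):
    """Rough similarity ratio between two HTML responses (0.0 to 1.0)."""
    return difflib.SequenceMatcher(None, html_a, html_b).ratio()

def looks_cloaked(bot_copy, browser_copy, threshold=0.8):
    # Heuristic: flag the page when the spider food differs
    # substantially from what a browser receives. The 0.8 cutoff
    # is arbitrary; real detection would also have to ignore ads,
    # timestamps, and other legitimately dynamic content.
    return similarity(bot_copy, browser_copy) < threshold
```

A smarter cloaker defeats this by serving the same markup and varying only by requester IP, which is exactly why crawling from unexpected IP addresses matters.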