Follow-up post: see why e-commerce software sucks.
Erol, which itself offers not-exactly-cheap SEO services, claims that it is perfectly OK to show Googlebot a content page without gimmicks, whilst human users get redirected to another URL.
Erol's victims suffer deindexing of all Erol-driven pages; Google keeps only those pages in the index which do not contain Erol's JS code. Considering how many online shops in the UK make use of Erol software, this massive traffic drop may have a visible impact on the gross national product … OK, sorry, kidding with so many businesses at risk does not amuse the Queen.
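For readers wondering what that setup looks like in practice, here is a minimal sketch of the "static page plus client-side redirect" pattern described above. The URL mapping and function name are invented for illustration; this is not Erol's actual code. A crawler that does not execute JavaScript indexes the static page; a human visitor never sees it.

```javascript
// Hypothetical sketch of the pattern: a crawlable static page carries the
// indexable content, but a script immediately whisks browsers away to a
// different URL. The mapping below is an assumption for illustration only.

// Compute the URL a browser gets redirected to.
function redirectTarget(staticPath) {
  // e.g. "/widget.html" -> "/widget-frameset.html" (assumed mapping)
  return staticPath.replace(/\.html$/, "-frameset.html");
}

// In the shipped page, a snippet like this would run on load:
// window.location.replace(redirectTarget(window.location.pathname));
// A crawler that ignores JavaScript indexes the static page instead -
// precisely the "sneaky redirect" Google's guidelines forbid.
```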
Dear “SEO experts” at Erol, could you please read Google’s quality guidelines:
· Don’t […] present different content to search engines than you display to users, which is commonly referred to as “cloaking.”
· Don’t employ cloaking or sneaky redirects.
· If a site doesn’t meet our quality guidelines, it may be blocked from the index.
Google did your customers a favour by not banning their whole sites, probably because the standard Erol content presentation technique is (SEO-wise) just amateurish, not caused by deceitful intent. So please stop whining
We are currently still investigating the recent changes Google have made which have caused some drop-off in results for some EROL stores. It is as a result of the changes by Google, rather than a change we have made in the EROL code that some sites have dropped. We are investigating all possible reasons for the changes affecting some EROL stores and we will, of course, feedback any definitive answers and solutions as soon as possible.
and listen to your customers stating
Hey Erol Support
Maybe you should investigate doorway pages with sneaky redirects? I’ve heard that they might cause “issues” such as full bans.
And tell your victims, er, customers the truth; they deserve it.
Update: It seems that there's a patch available. In Erol's support forum, member Craig Bradshaw posts: "Erols new patch and instructions clearly tell customers not to use the page loading messages as these are no longer used by the software."
Matt Cutts December 27, 2006 and December 28, 2006: “We have written about sneaky redirects in our webmaster guidelines for years. The specific part is ‘Don’t employ cloaking or sneaky redirects.’ We make our webmaster guidelines available in over 10 different languages … Ultimately, you are responsible for your own site. If a piece of shopping cart code put loads of white text on a white background, you are still responsible for your site. In fact, we’ve taken action on cases like that in the past. … If for example I did a search […] and saw a bunch of pages […], and when I clicked on one, I immediately got whisked away to a completely different url, that would be setting off alarm bells ringing in my head. … And personally, I’d be talking to the webshop that set that up (to see why on earth someone would put up pages like that) more than talking to the search engine.”
Matt Cutts heads Google's Web spam team and has discussed these issues in many places since the stone age. Look at the dates above: penalties for cloaking / JS redirects are not a new thing. The answer to "It is as a result of the changes by Google, rather than a change we have made in the EROL code that some sites have dropped." (Erol statement) is: just because you've got away with it for so long does not mean that JS redirects are fine with Google. The cause of the mess is not a recent change of code; it's the architecture itself, which Google considers a "cloaking / sneaky redirect". Google has recently improved its automated detection of client-side redirects, not its guidelines. Considering that both Erol-created pages (the crawlable static page and the content served by the URL invoked by the JS redirect) present similar content, Google will have sympathy for reinclusion requests, provided the sites in question are made squeaky-clean first.
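The "improved automated detection" mentioned above can be pictured as a pattern check over the HTML served to the crawler. The sketch below is a toy only; the patterns are my assumptions for illustration, and Google's real signals are far more sophisticated than a regex scan.

```javascript
// Toy sketch of an automated client-side-redirect check: scan the HTML a
// crawler receives for an unconditional redirect that fires on load.
// These patterns are illustrative assumptions, not Google's actual signals.

function looksLikeClientSideRedirect(html) {
  const redirectPatterns = [
    /window\.location(\.href)?\s*=/i,      // location assignment
    /location\.replace\s*\(/i,             // location.replace(...)
    /<meta[^>]+http-equiv=["']?refresh/i,  // meta-refresh variant
  ];
  return redirectPatterns.some((re) => re.test(html));
}
```

A page flagged this way would then warrant a closer look, e.g. comparing the static content against whatever the redirect target serves.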