If your Web site was banned by Google

If your Web site was banned by Google for reasons like hidden text, invisible links, client-side instant redirects, doorway pages and the like, chances are the ban is limited to 30 days or a few months. When you search for your domain name and get a result page stating “Google knows zilch about that shady site”, although you previously had listings on Google’s SERPs, then:

Save all your server logs and extract every request made by Googlebot.

Shortly after banning a site, Google usually reduces its crawl frequency drastically. That is, Googlebot starts checking for the suspicious stuff and no longer crawls for indexing purposes.
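As a minimal sketch of the log work (assuming Apache-style combined log format and a hypothetical file name access.log), here is one way to pull Googlebot’s requests out of the raw logs and watch the daily crawl frequency:

    import re
    from collections import Counter
    from datetime import datetime

    # Matches date, method, URL and status of an Apache combined-log line, e.g.:
    # 66.249.66.1 - - [12/Dec/2005:06:25:24 +0000] "GET /page.html HTTP/1.1" 200 ...
    LOG_LINE = re.compile(r'\[(\d{2}/\w{3}/\d{4}):[^\]]*\] "(\w+) ([^ "]+)[^"]*" (\d{3})')

    hits_per_day = Counter()
    requested_urls = set()

    with open("access.log") as log:            # hypothetical log file name
        for line in log:
            if "Googlebot" not in line:        # crude but sufficient UA match
                continue
            match = LOG_LINE.search(line)
            if not match:
                continue
            date, method, url, status = match.groups()
            day = datetime.strptime(date, "%d/%b/%Y").date()
            hits_per_day[day] += 1
            requested_urls.add(url)

    # A sudden drop in daily hits marks the switch to suspicion-driven crawling.
    for day in sorted(hits_per_day):
        print(day, hits_per_day[day])

    # These are the pages to double-check, one by one (next step).
    for url in sorted(requested_urls):
        print(url)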

Look at every page Googlebot requested. Double-check it for hidden stuff and artificial linkage. Fix the on-page mistakes (a polite description for over-optimization). Delete the page if it is part of a thin-page series (large numbers of pages carrying small amounts of repetitive but keyword-optimized textual content, a.k.a. “doorway pages”). Delete all (thin) pages which do a client-side redirect to the home page or a profitable landing page. “Deletion” means physical removal, not redirection to a clean page. If your doorway pages don’t respond with an honest 404 when Googlebot revisits them, the ban will not be lifted. Consider canned site-search results, thin product pages with full navigation (e.g. only SKU, name and image), and similar stuff shady too. If you think those pages are helpful for visitors, then make sure SE crawlers cannot fetch or index them.
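Once the cleanup is done, it’s worth verifying that every deleted page really answers with an honest 404 rather than a redirect. A minimal sketch, assuming a hypothetical file deleted-urls.txt with one absolute URL per line:

    import urllib.request
    import urllib.error

    # Hypothetical list of removed doorway-page URLs, one per line.
    with open("deleted-urls.txt") as f:
        urls = [line.strip() for line in f if line.strip()]

    for url in urls:
        try:
            # urlopen follows redirects, so a doorway page that 301s to a clean
            # page shows up as STILL LIVE here, which is exactly what to catch.
            response = urllib.request.urlopen(url)
            print(f"STILL LIVE ({response.status}): {url}")
        except urllib.error.HTTPError as e:
            if e.code == 404:
                print(f"OK (404): {url}")       # honest 404, as required
            else:
                print(f"CHECK ({e.code}): {url}")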

Hire a professional SEO for a last check and a second opinion. Removing questionable stuff is a good opportunity to implement effective optimization.

As soon as the crawling frequency goes back to the old cadence, and you’re sure your site is clean, file a reinclusion request. Write up honestly what you did to cheat Google, explain how you’ve fixed your stuff, and why it can’t happen again.

Keep in mind that there is no such thing as a second successful reinclusion request. That means if you cheat again, even unintentionally, your site is toast.

If your site was suspended for 30 days or so, it can reappear on the SERPs even without a reinclusion request. However, filing one shouldn’t hurt, and doing it before an estimated algorithmic reinstatement can speed up the process if the initial penalty was applied manually, since a hand-applied ban seems to require a human review to lift it.

Best of luck!


Is the spam condom efficient and ethical?

Jim Boykin from WeBuildPages raises a few very good questions in his two-part essay on link condoms in blog comments. Jim finally asks “Is the rel=nofollow our friend or our enemy?” and I have no definite answer.

If Blogger allowed me to opt out of the comment condom thingy, I would do it with this blog. When I don’t delete a comment containing a link, the poster has something to say, and an embedded link doesn’t deserve castration, regardless of whether I agree or not. Well, perhaps I’d unlink overdone URL drops in some cases.

If I ran a popular blog, I’d like a white-list approach best. That is, every link in comments gets sterilized by default and all posts are pre-moderated, with captchas in place. Trusted users could post instantly without the link condom, and I could pull the condom from particular comments. I’m not aware of any blog software handling it this way, unfortunately.
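As a rough sketch of that white-list approach (the trusted-user list and the rendering function are hypothetical, not any real blog engine’s API):

    import re

    TRUSTED_AUTHORS = {"trusted-regular", "another-regular"}   # hypothetical whitelist

    # Crude regex: matches <a ...> tags that don't already carry a rel attribute.
    # A real implementation would use an HTML parser instead.
    A_TAG = re.compile(r'<a\s+(?![^>]*\brel=)', re.IGNORECASE)

    def condomize(html):
        """Sterilize every link by adding rel="nofollow"."""
        return A_TAG.sub('<a rel="nofollow" ', html)

    def render_comment(author, html, condom_pulled=False):
        # Trusted users post without the link condom; the blogger can also
        # pull the condom from a particular comment via condom_pulled.
        if author in TRUSTED_AUTHORS or condom_pulled:
            return html
        return condomize(html)

    print(render_comment("anonymous", '<a href="http://example.com/">look!</a>'))
    # -> <a rel="nofollow" href="http://example.com/">look!</a>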

Is the spam condom efficient? Nope. Comment moderation, captchas, spam filters, perhaps even registered users are enough to protect a blog from comment spam. Also, many blogs run outdated, never-updated pre-nofollow software, so savvy spammers can still inject crappy links in enough places to keep it profitable.

Is the spam condom ethical? Nope. At least not when the blogger can’t opt out. Not every comment is spam. Comments add content to a blog. Why penalize the content vendors?


Thanks folks

Thank you all for reading my blog and clicking the ads, er, charity links that much. In 2005 I didn’t manage to become a real blogger, and I won’t mutate into a frequent writer next year, because 2006 will be the year of splogging, er, auto-blogging in huge blog networks, and I can’t output interesting stuff at a competitive cadence. However, stay tuned for a nice gem in January or February at the latest.

I wish you and yours an awesome holiday
Sebastian




Google to provide SEO services to AOL?

John Battelle’s post on the Google-AOL deal contains a very interesting snippet in the updates:

… Google will also provide technical assistance so AOL can create Web pages that will appear more prominently in the search results list. But this assistance will not change computer formulas that determine the order in which pages are listed in Google’s search results.

For an SEO job like this one I’d even ask Google for employment. Seriously, Google should hire professional SEOs for this task.


SEO for Consulting Firms, Lawyers, Tax Advisors …

To attract targeted search engine traffic, a consulting firm must publish all its business secrets on the ‘Net. Well, that does not mean the payroll and the balance sheet, but I needed a provocative slogan for a piece I wrote on a suitable SEO strategy for consultants, who by definition don’t tell you anything without a fee paid upfront ;)

Asking why so many consulting firms lack search engine visibility leads to a simple conclusion: they hide themselves on the Web. They actively prevent search engines from ranking their Web sites in top spots on the search result pages, although they spend shitloads of jolly green giants to operate fancy Web sites which please the ego, but not the engines or even the user.

In their constant fear of revealing knowledge which may be sellable some day, they praise their genius in terrific mission statements and generic visions, but they don’t put up any indexable content with the potential to rank for the solutions and services they provide.

I hope it’s a good read: A SEO Strategy for Consulting Firms

Related link: Web Logs for Lawyers: Lessons from Ernie the Attorney


SQUIDOO Impressions

Thanks to Peter’s reminder I’ve added a lens to Seth Godin’s content network, Squidoo BETA. The Lensmaster Workshop makes it easy to add text content, images, link lists, RSS feeds and whatever. Even a technically challenged expert on a topic should be able to put together a nice page within minutes.

Squidoo promises to share the ad revenue, but a few shared AdSense cents aren’t much of a goodie. If Squidoo ever gains an authority status like Wikipedia or the ODP, a well-linked page in this network can help move a Web site into a good neighborhood of related, high-ranking sites. For example, if you have a Web site dedicated to foo, link to all the great foo-related resources, including your site, from your foo lens, and wait.

Unfortunately, Squidoo will most likely never become a trusted authority from a search engine’s point of view. From the sales pitch:

WHO SHOULD BUILD A LENS?

You should, if you…

1. …have a Web site and you’re not happy with your PageRank in Google, a lens will increase it. That’s because a lens provides exactly what search engines are looking for: authoritative insight so people can find what they’re looking for.

Sounds like giving the fox free run of the hen house. Inviting link spammers to flood a non-audited content network with crap is plain weird. LensRank will not be enough to close the loopholes:

Wikipedia has a system with one entry per topic. We don’t. Instead, we encourage multiple lenses on a topic. Then, we use an automated algorithm—LensRank—to rank the lenses. We look at user ratings, lensmaster reputation, clickthrough rates, frequency of updates, inbound and outbound links, and other factors and give the lens a number. And we make it clear to the lensmaster what her rank is and how to improve it.

There is nothing a savvy spammer can’t abuse with ease.
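To illustrate: a toy weighted-sum version of such a score (the weights and factor names below are made up, not Squidoo’s actual formula) shows that every single input is under user control:

    # Toy LensRank-style score; weights and factors are guesses, not Squidoo's.
    def lens_rank(user_rating, reputation, clickthrough_rate,
                  update_frequency, inbound_links, outbound_links):
        return (0.30 * user_rating          # spoofable with sock-puppet accounts
              + 0.20 * reputation           # farmable across a lens network
              + 0.20 * clickthrough_rate    # inflatable with bots or click rings
              + 0.10 * update_frequency     # a cron job "updates" a lens daily
              + 0.15 * min(inbound_links, 100) / 100    # link spammers' home turf
              + 0.05 * min(outbound_links, 100) / 100)  # trivially padded

    print(lens_rank(0.9, 0.8, 0.7, 1.0, 250, 40))   # all six inputs manufactured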

Besides potential spam issues, I do like Squidoo.


WMW Gem - Don’t optimize for keywords

Desperately optimizing for particular keyword phrases can kill better-converting SE traffic from natural search terms.

In a usually pretty useless Google update thread, MHes provides a real gem; start reading with message 183 of “Dealing With Consequences of Jagger Update” in the Google forum. If you want to hear it from the horse’s mouth, then listen to this Matt Cutts interview at Webmaster Radio, spotted via Threadwatch.

Actually, that’s not a new thing. Writing longer, natural copy matches more search queries than a short page heavily targeting the apparent money term harvested from various keyword research tools. It’s a good idea to support longer pages with short pieces highlighting particular terms, e.g. footnote pages, glossary pages and so on, but the long page usually generates the most sales.


Ego Food @ Aaron Pratt’s SeoBuzzBox

Aaron Pratt from SEO Buzz Box kindly gave me the opportunity to please my ego by talking about SEO, Google Sitemaps, online consulting, and Guinness. Thank you, Aaron, for this interview.




How to get trusted inbound links

Post-Jagger, the vital question is how a Web site can acquire trusted authority links. Well, I can’t provide the definitive answer, but here’s a theory and perhaps a suitable methodology.

Mutate from a link monkey into a link ninja. Follow Google’s approach to identifying trustworthy resources. Learn to spot sources of TrustRank, then work hard to attract their attention (by providing outstanding content, for example). Don’t bother with link requests; be creative instead. Investing a few days or even weeks to gain a trusted inbound link is worth the effort. Link quality counts; quantity may even be harmful.
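For background: TrustRank, as described in the Gyöngyi/Garcia-Molina/Pedersen paper, is essentially biased PageRank. Trust flows from a hand-vetted seed set of good sites through their outbound links and decays with each hop. A minimal sketch on a made-up toy link graph:

    # Toy link graph: page -> pages it links to (all names hypothetical).
    GRAPH = {
        "dmoz":     ["yoursite", "foo"],
        "w3c":      ["yoursite"],
        "yoursite": ["foo"],
        "foo":      [],
        "spamsite": ["yoursite"],
    }
    SEEDS = {"dmoz", "w3c"}     # hand-vetted trustworthy sources
    DAMPING = 0.85

    def trust_rank(graph, seeds, damping=DAMPING, iterations=20):
        seed_share = 1.0 / len(seeds)
        trust = {p: (seed_share if p in seeds else 0.0) for p in graph}
        for _ in range(iterations):
            incoming = {p: 0.0 for p in graph}
            for page, outlinks in graph.items():
                for target in outlinks:
                    incoming[target] += trust[page] / len(outlinks)
            # Unlike PageRank, the teleport goes back to the seed set only,
            # so trust decays with distance from the trusted sources.
            trust = {p: (1 - damping) * (seed_share if p in seeds else 0.0)
                        + damping * incoming[p]
                     for p in graph}
        return trust

    for page, score in sorted(trust_rank(GRAPH, SEEDS).items(),
                              key=lambda item: -item[1]):
        print(f"{page:10s} {score:.3f}")

Note that the link from spamsite passes no trust at all, while a single link from a seeded source matters a lot. That’s why one trusted inbound link is worth weeks of work.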

Something to start with: DMOZ has, in parts, a high TrustRank, but a DMOZ link alone may harm, because Google knows that a handful of editors aren’t that honest. A Yahoo listing can support an established site that already has trusted inbound links, but alone, or together with an ODP link, it may hurt too, because it’s that easy to get.

Other sites with a high TrustRank are Google.com and other domains owned by Google, like their blogs (tough, but not impossible, to get a link from Google), W3C.org, most pages on .edu and .gov domains, your local chamber of commerce, most newspapers … just to give a few examples.

I bet Matt Cutts’s blog, OTOH, has a pretty low TrustRank, because he is obviously part of a ‘very bad neighborhood’, despite his very honorable intentions. The SEO community, including various stealthy outlets, is also a place to avoid if you’re hunting trusted links.

More information: How to Gain Trusted Connectivity


New feed: my last updates

I’ve revamped my What’s New page today, because I’ve realized that I don’t write new articles and tutorials that often. Mostly I add a page to an existing document, or update pages with fresh information. Neither the site feed nor the updates page reflected this behavior. Here is the update feed:

http://www.smart-it-consulting.com/last-updates.rss
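If you want to offer a similar updates feed for your own site, a minimal sketch (the update list below is hypothetical; a real setup would pull it from the content management system):

    from xml.sax.saxutils import escape

    # Hypothetical recent updates: (URL, title, RFC-822 date).
    UPDATES = [
        ("http://www.example.com/some-updated-page.htm",
         "Tutorial: new FAQ page added",
         "Fri, 23 Dec 2005 10:00:00 +0000"),
    ]

    def updates_rss(updates):
        items = "".join(
            "  <item>\n"
            f"    <title>{escape(title)}</title>\n"
            f"    <link>{escape(url)}</link>\n"
            f"    <pubDate>{date}</pubDate>\n"
            "  </item>\n"
            for url, title, date in updates
        )
        return (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<rss version="2.0">\n'
            "<channel>\n"
            "  <title>Last updates</title>\n"
            "  <link>http://www.example.com/</link>\n"
            "  <description>Recently updated pages</description>\n"
            + items +
            "</channel>\n"
            "</rss>\n"
        )

    print(updates_rss(UPDATES))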



