I’m not exactly a fan of non-crawlable content, but here it goes, provided by Joshua Titsworth (click on a tweet in order to favorite and retweet it):
What has to be said
Google’s link policy is ape shit. They don’t even believe their own FUD, so don’t bother listening to crappy advice anymore. Just link out without fear and encourage others to link to you fearlessly. Don’t file reinclusion requests the moment you hear about a Googlebug on your preferred webmaster hangout just because you might have spread a few shady links by accident, and don’t slaughter links you’ve created coz the almighty Google told you so.
Just because a bazillion flies can’t err, that doesn’t mean you have to eat shit.
Is anybody aware of the fact that the prerequisite of Google’s absurd link policy is their algo’s inability to judge links?
First they’ve created a huge link economy with their silly PageRank toolbar and green pixels spread all over their DMOZ clone.
Next they were totally surprised that their shiny system was easy to manipulate, and that the sooo evil webmasters were clever after all.
Instead of developing tools to rate, rank and judge links independently of the source’s PageRank, they declared war on HTML’s A element.
Of course, it’s way easier to delegate responsibility for “natural” linkage to the world outside building 43, providing only a crappy …
… rel-nofollow instrument that, by the way, suffered several redefinitions over the years, confusing the hell out of webmasters, than …
… hiring a few bright folks to create sensible methods and procedures to actually interpret links as provided in the markup.
IIRC I earned my first penalty for huge artificial link swaps way back in 2001 or so, maybe earlier.
One would think that a bunch of smart engineers could figure out how to handle links for ranking purposes in a decade, right? Not.
Frankly, I suspect Google can qualify links by now, but out of sheer evilness they still bother us with outlandish rules on googley linkage.
Why don’t they say “THX for your assistance & gazillions of wasted man years for implementing our rules, we don’t need this crap anymore” ?
That’d be the geek way to handle a major clusterfuck. Coz they still nebulize the truth, they’re un-geeky.
I don’t know a worse cuss word than un-geeky. #thatisall
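For readers who never touched it: the rel-nofollow instrument those tweets complain about is nothing more than an attribute value on HTML’s A element. A minimal sketch; the URL is a placeholder:

```html
<!-- A plain link: crawlers may follow it and count it toward the target's ranking. -->
<a href="https://example.com/">a fearless link</a>

<!-- The same link with rel-nofollow: a machine-readable "don't vouch for this" hint,
     whose exact meaning Google has redefined more than once since its 2005 debut. -->
<a href="https://example.com/" rel="nofollow">a distrusted link</a>
```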
Google is heavily promoting its confession-booth reinclusion-request self-reporting feature; wonder why? Maybe they’re just guessing you might have some shady links, or they slap those notifications right and left at random, hoping people will report themselves and help them “improve their algo” … Some outing-powered algo, it appears.
But what better way to spot a site that has an SEO hovering over it than looking at a link profile and NOT seeing any shit links? That would take all the guesswork out of it.
Yup, they’re driven by fear of failure. If their link rating routine could fail on a spammy link, it’s better to penalize a couple dozen sites, applying the throw-bathtubs-with-babies principle, than to get their stuff together.
Unfortunately, they also create fear in innocent webmaster souls all across the planet.
Some call that behavior evil. I’m truly not a passionate Google hater, so I call it plain bullshit.
Listen: as long as I can launch a gazillion spammy links (which actually help the site I’m spamming for, despite Google’s denials), a competitor can do the same to me, falsely assuming that this alone will tank the link target on Google’s SERPs. In fact, I’d say thanks for the promotional efforts.
So what does that mean?
(1) As long as Google for some reason can’t devalue the boost effect of my low-life links in their ranking algos, that’s solely their problem, as long as I don’t own their search engine (which should be the case by now, considering all the free advice I’m providing Google with in my pamphlets … #unfair).
(2) As for negative SEO, see (1). Of course the methods and procedures of an SEO in this business are more diverse.
That’s the ideal world, where artificial links launched by either the site owner or others lack impact. In fact, Google has created scattered gray zones where spammy links indeed can harm a site’s rankings, just because they couldn’t resist implementing penalties for non-googley linkage (the lazy and un-geeky approach). These parts of their algos are obsolete. Instead, they should have invested more time in fine-tuning their link judgement (the geek way to get things done).
I have a site that is de-indexed by Google but ranks well in Bing and Yahoo. That site is one of the funnest sites to work on, ’cause I can build links anywhere and not worry about Google freaking out.
So… now what? Seven days ago I would have agreed wholeheartedly, but now this Penguin shit is flipping the SERPs upside down and seemingly prioritizing the sites that no one wants to link to. Or completely irrelevant sites. Or empty sites.
Now it seems like Google would rather show my unrelated pages instead of the very specific matches that have a decent backlink profile. Whatever, as long as they don’t have to show pages that put effort into promoting their content!
Alright, that’s my counter-rant. I’m going to go back to staring at numbers until I imagine some patterns.