Search engines should make shortened URIs somewhat persistent

URI shorteners are crap. Each and every shortened URI expresses a design flaw. All, or at least most, public URI shorteners will shut down sooner or later, because shortened URIs are hard to monetize. Making use of 3rd-party URI shorteners translates to "putting traffic at risk". Not to speak of the link love (PageRank, Google juice, link popularity) lost forever.

SEs could rescue tiny URLs

Search engines could provide a way out of the sURL dilemma that Twitter & Co created with their crappy, thoughtless and shortsighted software designs. Here's how:

Most browsers support search queries in the address bar, as well as suggestions (aka search results) on DNS errors, and sometimes even on 404s and other HTTP response codes other than 200/3xx. That means browsers "ask a search engine" when an HTTP request fails.

When a TLD is out of service, search engines may have crawled a 301 redirect or a meta refresh from a page formerly living on, for example, a .yu domain. They know the new address and can lead the user to that (working) URI.

The same goes for shortened URIs created ages ago by URI shortening services that died in the meantime. Search engines have transferred all the link juice from the shortened URI to the destination page already, so why not point users that request a dead short URI to the right destination?
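A minimal sketch of that idea, assuming a search engine simply kept its crawl-time redirect data instead of de-indexing it (all URIs and names below are invented for illustration):

```python
# Hypothetical sketch: a search engine looks up the last known
# destination of a shortened URI that no longer resolves.
# The data and names are made up for illustration only.

# Redirect targets recorded by the crawler while the shortener was alive.
crawl_time_redirects = {
    "http://tinylink.example/abc123": "http://some-blog.example/great-post",
    "http://tinylink.example/xyz789": "http://another-site.example/page",
}

def rescue_short_uri(requested_uri):
    """Return the last known destination of a dead short URI, or None."""
    return crawl_time_redirects.get(requested_uri)

# A browser's failed request could then be answered with the real page:
print(rescue_short_uri("http://tinylink.example/abc123"))
# -> http://some-blog.example/great-post
```

Since the engine has already passed the link juice along to the destination page, serving that destination to users is just reusing data it stored anyway.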

Search engines have all the data required for rescuing out-of-service short URIs in their databases already. Not de-indexing "outdated" URIs belonging to URI shorteners would be a minor tweak. At least Google has stored the attributes and behavior of all links on the Web since the past century, and most probably the other search engines are operated by data rats too.

URI shorteners can be identified by simple patterns. They gather tons of inbound links from foreign domains that get redirected (not always using a 301!) to URIs on other 3rd party domains. Of course that applies to some AdServers too, but rest assured search engines do know the differences.
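As a rough illustration of such a pattern, here's a toy heuristic of my own making, not any engine's actual classifier: a domain whose crawled URIs almost all redirect to pages on many distinct third-party hosts smells like a shortener.

```python
# Toy heuristic, assumed for illustration: flag a domain as a likely
# URI shortener when nearly every crawled URI on it redirects
# cross-domain, and the redirect targets span many distinct hosts.
from urllib.parse import urlparse

def looks_like_shortener(observations, min_ratio=0.9, min_distinct_hosts=5):
    """observations: list of (source_uri, target_uri) pairs recorded by
    a crawler; target_uri is None when the source did not redirect."""
    if not observations:
        return False
    cross_domain = 0
    target_hosts = set()
    for source, target in observations:
        if target is None:
            continue
        src_host = urlparse(source).netloc
        dst_host = urlparse(target).netloc
        if dst_host and dst_host != src_host:
            cross_domain += 1
            target_hosts.add(dst_host)
    ratio = cross_domain / len(observations)
    return ratio >= min_ratio and len(target_hosts) >= min_distinct_hosts

# Ten short URIs, each redirecting to a different third-party host:
samples = [("http://sho.rt/%d" % i, "http://site%d.example/page" % i)
           for i in range(10)]
print(looks_like_shortener(samples))  # -> True
```

Note that an AdServer funneling all clicks to a handful of advertiser domains would fail the distinct-hosts test, which hints at one way the two cases could be told apart.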

So why the heck haven't Google, Yahoo, MSN/Bing, and Ask offered such a service yet? I thought it's all about users, but I might have misread something. Sigh.

By the way, I've recorded search engine misbehavior with regard to shortened URIs that could arouse Jack The Ripper, but that's a completely different story.


10 Comments to "Search engines should make shortened URIs somewhat persistent"

  1. Stuart Livesey on 7 October, 2009  #link

    While I've used URL shorteners from time to time with links that point people to other sites, I've never used them on links to our own or to clients' sites, for the simple reason that there was never any guarantee they'd be around in a year or two.

    Sadly we've seen a lot of people jump on bandwagons without giving any thought to what might happen next, and these URL shorteners are just one example. I hope people will learn the lesson that's so obvious here, but I bet they won't.


  2. Kevin Spence on 7 October, 2009  #link

    This is great brain candy.

    The issue, I think, is about more than just shortened URLs. Many social sites (Facebook or MySpace, for example) redirect all links through their own URLs.

    What if Facebook or Myspace falls? All those links disappear. Could we reach a point where websites become too big to fail and we have to bail them out like the auto industry and banks?

  3. @Drivelry on 7 October, 2009  #link

    Read this via @Blogbloke tweeting it.

    It’s a good idea. Worth posting to Google’s Webmaster Help Forum.

  4. Manley on 7 October, 2009  #link

    I was watching you tweeting about shorteners and wondered where you were going. I have to admit that I was suspecting techy tin foil hattery, but I am pleasantly surprised.

  5. Michael Thomas on 7 October, 2009  #link

    I can't see these shortened URLs helping at all, as they are effectively brand-new URLs. If you did some link building then possibly they would have some relevance, but why would you bother to do that?

  6. Sebastian on 8 October, 2009  #link

    Manley, perhaps some “techy tin foil hattery” about evil URI shorteners and search engines is coming soon. Actually, this one was scheduled as part of another post. Watching URI shorteners shutting down in a row moved me to post this upfront.

  7. Sebastian on 8 October, 2009  #link

    Kevin, this would work with any service that obfuscates (redirects, shortens, …) URIs for traffic theft or other evil purposes, even Facebook and MySpace.

  8. […] Search engines should make shortened URIs somewhat persistent, […]

  9. Y8 on 12 October, 2009  #link

    You are so right. I hate, hate, hate shortened URIs. Btw, I will keep reading your blog, Sebastian, as I have read many interesting things here. Have a nice day. :)

  10. […] for a dose of techy tin foil hattery. Again, I’m going to rant about a nightmare that Twitter & Co created with their crappy, […]
