Better not to run a web server under Windows

IIS defaults can cause serious trouble with search engines. That’s a common problem, and not even all NHS (UK Government National Health Service) admins have spotted it. I’ve alerted Whipps Cross University Hospital, but I can’t email every NHS site suffering from IIS and lazy or uninformed webmasters. So here’s the fix:

In IIS, create a (second) Web site answering to the hostname without the “www” subdomain, then go to its “Home Directory” tab and select the option “A redirection to a URL”. As “Redirect to” enter the destination, that is, the canonical “www” URL followed by “$S$Q”, without a slash after the domain name, because the path ($S placeholder) begins with a slash. The $Q placeholder represents the query string. Next check “The exact URL entered above” and “A permanent redirection for this resource”, and submit. Test the redirection with a suitable tool.
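To illustrate why the “Redirect to” value must not end with a slash, here’s a minimal sketch of how the placeholders combine. It assumes IIS’s documented placeholder semantics; “example.co.uk” is a made-up stand-in domain, not from this post:

```python
# Minimal sketch of IIS's "$S$Q" placeholder expansion: $S is the
# requested path (it begins with a slash, hence no slash after the
# domain in the "Redirect to" field) and $Q represents the query
# string. "example.co.uk" is a hypothetical stand-in domain.

def canonical_target(host, path, query=""):
    """Build the URL a 'http://www.<host>$S$Q' redirect points to."""
    q = "?" + query if query else ""
    return "http://www." + host + path + q

print(canonical_target("example.co.uk", "/dept/page.asp", "id=42"))
# -> http://www.example.co.uk/dept/page.asp?id=42
```

A redirect checker then just needs to confirm that the non-www host answers 301 with exactly this Location header for any path/query combination.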

Now when a user enters a URL without the “www” prefix, s/he gets the requested page from the canonical server name. Search engine crawlers following non-canonical links will transmit the link love to the desired URL too, and will index more pages instead of deleting them from their search indexes after a while because the server is not reachable. I’m not joking: under some circumstances, all or many www-URLs of pages referenced by relative links resolving to the non-existent server will get deleted from the search index after a couple of unsuccessful attempts to fetch them without the www prefix.

Hat tip to Robbo

Yahoo Pipes jeopardizes the integrity of the Internet

Update: This post, initially titled “No more nofollow-insane at Google Reader”, then updated as “(No) more nofollow-insane at Google Reader”, accused Google Reader of inserting nofollow crap. I apologize for my lazy and faulty bug report. Read the comments.

I fell in love with Yahoo Pipes because that tool allowed me to funnel the tidbits contained in a shitload of noise into a more or less clear signal. Instead of checking hundreds of blog feeds, search query feeds and whatever else, I was able to feed my preferred reader with actual payload extracted from vast loads of paydirt dug from lots of sources.

Now that I’ve learned that Yahoo Pipes is evil, I guess I must code the filters myself. Nofollow insanity is not acceptable. Nofollow madness jeopardizes the integrity of the Internet, which is based on free linkage. I don’t need no stinking link condoms sneakily forced on me by nice looking tools with nifty round corners. I’ll be way happier with a crappy and uncomfortable PHP hack fed with OPML files and conditions pulled from a manually edited MySQL table.
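The hack idea above can be sketched in a few lines (Python here rather than PHP). The items and keyword conditions are invented sample data; a real version would pull feeds from an OPML file and conditions from the MySQL table mentioned above:

```python
# Sketch of a feed filter: keep only items whose title/summary match
# the "must contain" keywords and none of the "must not contain"
# ones. The sample items and conditions below are made up.

def filter_items(items, must_contain, must_not_contain=()):
    keep = []
    for item in items:
        text = (item["title"] + " " + item["summary"]).lower()
        if any(k in text for k in must_contain) and \
           not any(k in text for k in must_not_contain):
            keep.append(item)
    return keep

items = [
    {"title": "nofollow madness", "summary": "link condoms everywhere"},
    {"title": "cat pictures", "summary": "fluffy"},
]
print(filter_items(items, must_contain=("nofollow",)))
```

The payoff is that nothing between the feed and the reader gets a chance to rewrite the links.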

Here is the evidence right from the Yahoo pipe output:
Also, abusing my links with target=”_blank” is not nice.
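Link decoration like this is easy to spot by comparing the pipe’s output with the original markup. A tiny stdlib-only sketch; the HTML snippet is an invented example, not the actual pipe output:

```python
# Collect the attributes of every anchor in an HTML fragment and
# report which anchors carry a given (possibly injected) attribute,
# e.g. rel="nofollow" or target="_blank". Demo markup is made up.
from html.parser import HTMLParser

class AnchorAttrs(HTMLParser):
    def __init__(self):
        super().__init__()
        self.anchors = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.anchors.append(dict(attrs))

def injected_attrs(html, attr):
    parser = AnchorAttrs()
    parser.feed(html)
    return [a for a in parser.anchors if attr in a]

pipe_output = '<p><a href="http://example.com/" rel="nofollow" target="_blank">x</a></p>'
print(len(injected_attrs(pipe_output, "rel")))  # -> 1
```

Run it once over your own post and once over the proxied version; any attribute that appears only in the latter was added in transit.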

Initial post and its first update:

I’m glad Google has removed the auto-nofollow on links in blog posts. When I add a feed I trust its linkage and I don’t need no stinking condoms on pages nobody except me can see unless I share them. Thanks!

Update - Nick Baum, can you hear me?

It seems the nofollow-madness is not yet completely buried. Here is a post of mine and what Google Reader shows me when I add my blog’s feed:
And here is the same post filtered thru a Yahoo pipe:
So please tell me: why does Google auto-nofollow a link to Vanessa Fox when she gets linked via Yahoo, but uncondomizes the link from Google’s very own blogspot dot com? Curious …


5 Reasons why I blog

So, since Matt Cutts (tagged by Vanessa Fox) cat-tagged me 5 times ;) I add my piece.

    Napping cats don't listen
  1. Well, I’ve started this blog because every dog and his grandpa blogs, but the actual reason was that I couldn’t convince my beloved old cat to listen to my rants any more. Sadly, my old comrade died years ago at the age of 15, leaving behind a gang of two-legged monsters rampaging in house and garden.
  2. Since then I’ve used my blogs for kidding, bollocks, and other stuff not suitable for the more or less static sites where I publish more seriously. However, I’ve scraped some wholehearted posts from the blog to put them on the consulting platform, because that site is way more popular. Vice versa, I’ve announced my other articles and projects here. This blog is somewhat of a playground to test the waters and concurrently a speaking tube. I’d still find it difficult to do that on another platform; the timely character of blogging perfectly allows burying half-baked things.
  3. Every now and then I write an open letter to Google, for example my series of pleas to revamp rel=nofollow. Perhaps a googler is listening ;)
    Also, a blog is a neat instrument to get the attention of folks who don’t seem to listen.
  4. Frankly, I like to share ideas and knowledge. Blogging is the perfect platform to raise rumors or myths too. Also, writing helps me to structure my thoughts; this works even better in a foreign language.
  5. Last but not least, I use my blog as a reference. While providing Google user support I sometimes just drop a link, particularly as an answer to repetitive questions. By the way, Google’s Webmaster Forum is a nice place to chase SEO tidbits straight from the horse’s mouth.

Although I admit I’ve somewhat tag-baited my way in here, I’m tagging you:
Thu Tu
John Müller
John Honeck
Jim Boykin
Gurtie & Chris


Four reasons to get tanked on Google’s SERPs

You know, I find “My Google traffic dropped all of a sudden - I didn’t change anything - I did nothing wrong” threads fascinating. Especially when they’re posted with URLs on non-widgetized boards. Sometimes I submit an opinion, although the questioners usually don’t like my replies, but more often I just investigate the case for educational purposes.

Abstracting from a fair number of tanked sites, I’d like to share a few of my head notes, or theses, as food for thought. I’ve tried to keep these as generalized as possible, so please don’t blame me for the lack of detailed explanations.

  1. Reviews and descriptions ordered by product category, product line, or other groupings of similar products tend to rephrase each other semantically, that is, in form and content. Be careful when it comes to money-making areas like travel or real estate. Stress unique selling points, non-shared attributes or uses, localize properly, and make sure reviews and descriptions don’t get spread in full on crawlable pages, internally as well as externally.
  2. Huge clusters of property/characteristic/feature lists under analogous headings, even unstructured ones, may raise a flag when the number of applicable attributes is finite and the values are rather similar, with just a few of them totally different or mere expressions of trite localization.
  3. The lack of non-commercial outgoing links on pages plastered with ads of any kind, or on pages at the very bottom of the internal linking hierarchy, may raise a flag. Nofollow’ing, redirecting or iFraming affiliate/commercial links doesn’t prevent breeding artificial page profiles. Adding unrelated internal links to the navigation doesn’t help. Adding Wikipedia links in masses doesn’t help. Providing unique textual content and linking to authorities within the content does help.
  4. Strong and steep hierarchical internal/navigational linkage without relevant crosslinks and topical between-the-levels linkage looks artificial, especially when the site in question lacks deep links. Look at the ratio of home page links vs. deep links to interior pages. Rethink the information architecture and structure.
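The home-page-vs.-deep-links ratio from point 4 can be eyeballed with a few lines of code. A rough sketch; the inbound URLs below are hypothetical examples:

```python
# Given a list of inbound link targets, compute the ratio of deep
# links (pointing at interior pages) to home page links. The sample
# URLs are made up for illustration.
from urllib.parse import urlparse

def deep_link_ratio(inbound_urls):
    deep = sum(1 for u in inbound_urls
               if urlparse(u).path not in ("", "/"))
    home = len(inbound_urls) - deep
    return deep / home if home else float("inf")

links = [
    "http://example.com/",
    "http://example.com/",
    "http://example.com/articles/topic.html",
]
print(deep_link_ratio(links))  # -> 0.5
```

A very low ratio across a large link profile is one symptom of the steep, artificial-looking structure described above.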

Take that as a call for antithesis, or just food for thought. And keep in mind that although there might be no recent structural/major/SEO/… on-site changes, perhaps Google just changed her judgement of the ancient stuff that has been ranking forever, and/or has just changed the ability of existing inbound links to pass weight. Nothing’s set in stone. Not even rankings.


Link monkey business is not worth a whoop

Old news; pros, move on to educating the great unlinked.

A tremendous number of businesses maintaining a Web site still swap links in masses with every dog and his fleas. Serious sites join link exchange scams and gain links from every gambling spammer out there. Unscrupulous Web designers and low-life advisors put gazillions of businesses at risk. Eventually the site owners pop up in Google’s help forum wondering why the heck they lost their rankings despite their emboldening toolbar PageRank. Told to dump all their links pages and to file a reinclusion request, they may do so, but cutting one’s losses short term is not the way the cookie crumbles with Google. The consequences of listening to bad SEO advice are often layoffs or even going bust.

In this context, a thread titled “Do the companies need to hire a SEO to get in top position?” asks somewhat the right question but may irritate site owners even more. Their amateurish Web designer offering SEO services obviously got their site banned, or at least heavily penalized, by Google. Asking for help in forums, they get contradictory SEO advice. Google’s take on SEO firms is more or less a plain warning. Too many scams sail under the SEO flag, and it seems there’s no such thing as reliable SEO advice for free on the net.

However, the answer to the question is truly “yes”. It’s better to see an SEO before the rankings crash. Unfortunately, SEO is not a yellow pages category, and every clown can offer crappy SEO services. Places like SEO Consultants and honest recommendations get you the top notch SEOs, but usually the small business owner can’t afford their services. Asking fellow online businesses for their SEO partner may lead to a scammer who is still beloved because Google has not yet spotted and delisted his work. Kinda dilemma, huh?

A possible loophole: once you’ve got a recommendation for an SEO-skilled Webmaster or SEO expert from somebody attending a meeting at the local chamber of commerce, post that fellow’s site to the forums and ask for signs of destructive SEO work. That should give you an indication of trustworthiness.



Nice closing words on “my stuff went supplemental” from JohnWeb.

Applying simplified conclusions to a complex SEO question reveals 20% of the truth, whilst the other 80% is just not worth discussing because the effort necessary to analyze one more percent equals the whole 20% analysis. The alternative is working with 20% reasonable conclusions plus 80% common sense.

Unfortunately, common sense is not as common as you might think. Just count the supplemental-threads across the board, then search for words of wisdom. Sigh.

Update: Read Matt’s Google Hell


Che Guevara of Search

I just can’t step away from the keyboard … who’s this well known guy?
Che Guevara


Follow-up on "Google penalizes Erol stores"

Background: these three posts on Google penalizing e-commerce sites.

Erol has contacted me and we will discuss the technical issues within the next days or maybe weeks. I take this as a positive signal, especially because my previous impression was that Erol is not willing to listen to constructive criticism, regardless of Google’s shot across the bow (more on that later). We agreed that before we come to the real (SEO) issues, it’s a good idea to render a few points made in my previous posts more precisely. In the following I quote parts of Erol’s emails with permission:

Your blog has made for interesting reading but the first point I would like to raise with you is about the tone of your comments, not necessarily the comments themselves.

A question of personal style; point taken.

Your article entitled ‘Why eCommerce Systems Suck‘, dated March 12th, includes specific reference to EROL and your opinion of its SEO capability. Under such a generic title for an article, readers should expect to read about other shopping cart systems and any opinion you may care to share about them. In particular, the points you raise about other elements of SEO in the same article, (’Google doesn’t crawl search results’, navigation being ‘POST created results not crawlable’) are cited as examples of ways other shopping carts work badly in reference to SEO - importantly, this is NOT the way EROL stores work. Yet, because you do not include any other cart references by name or exclude EROL from these specific points, the whole article reads as if it is entirely aimed at EROL software and none others.

Indeed, that’s not fair. Navigation solely based on uncrawlable search results without crawler shortcuts, or sheer POST results, are definitely not issues I’ve stumbled upon while investigating penalized Erol-driven online stores. Google’s problem with Erol-driven stores is client-sided cloaking without malicious intent. I’ve updated the post to make that clear.

Your comment in another article, ‘Beware of the narrow-minded coders‘ dated 26 March where you state: “I’ve used the case [EROL] as an example of a nice shopping cart coming with destructive SEO.” So by this I understand that your opinion is EROL is actually ‘a nice shopping cart’ but it’s SEO capabilities could be better. Yet your articles read through as EROL is generally bad all round. Your original article should surely be titled “Why eCommerce Systems Suck at SEO” and take a more rounded approach to shopping cart SEO capabilities, not merely “Why eCommerce Systems Suck”? This may seem a trivial point to you, but how it reflects overall on our product and clouds it’s capability to perform its main function (provide an online ecommerce solution) is really what concerns me.

Indeed, I meant that Erol is a nice shopping cart lacking SEO capabilities as long as the major SEO issues don’t get addressed asap. And I mean in the current version, which clearly violates Google’s quality guidelines. From what I’ve read in the meantime, the next version, to be released in 6 months or so, should eliminate the two major flaws with regard to search engine compatibility. I’ve changed the post’s title; the suggestion makes sense to me too.

I do not enjoy the traffic from search terms like “Erol sucks” or “Erol is crap” because that’s simply not true. As I said before, I think that Erol is a well rounded software product nicely supporting the business processes it’s designed for, and the many store owners using Erol I’ve communicated with recently all tell me that too.

I noted with interest that your original article ‘Why eCommerce Systems Suck’ was dated 12th March. Coincidentally, this was the date Google began to re-index EROL stores following the Google update, so I presume that your article was originally written following the threads on the Google webmaster forums etc. prior to the 12th March where you had, no doubt, been answering questions for some of our customers about their de-listing during the update. You appear to add extra updates and information in your blogs but, disappointingly, you have not seen fit to include the fact that EROL stores are being re-listed in any update to your blog so, once again, the article reads as though all EROL stores have been de-listed completely, never to be seen again.

With all respect, nope. Google did not reindex Erol-driven pages; Google had just lifted a “yellow card” penalty for a few sites. That is not carte blanche but, on the contrary, Google’s last warning before the site in question gets the “red card”, that is, a full ban lasting at least a couple of months or even longer. As said before, it means absolutely nothing when Google crawls penalized sites or when a couple of pages reappear on the SERPs. Here is the official statement: “Google might also choose to give a site a ‘yellow card’ so that the site can not be found in the index for a short time. However, if a webmaster ignores this signal, then a ‘red card’ with a longer-lasting effect might follow.”
(Yellow / red cards: soccer terminology, yellow is a warning and red the sending-off.)

I found your comments about our business preferring “a few fast bucks”, suggesting we are driven by “greed” and calling our customers “victims” particularly distasteful. Especially the latter, because you infer that we have deliberately set out to create software that is not capable of performing its function and/or not capable of being listed in the search engines and that we have deliberately done this in pursuit of monetary gain at the expense of reputation and our customers. These remarks I really do find offensive and politely ask that they be removed or changed. In your article “Google deindexing Erol driven ecommerce sites” on March 23rd, you actually state that “the standard Erol content presentation is just amateurish, not caused by deceitful intent”. So which is it to be - are we deceitful, greedy, victimising capitalists, or just amateurish and without deceitful intent? I support your rights to your opinions on the technical proficiency of our product for SEO, but I certainly do not support your rights to your opinions of our company and its ethics which border on slander and, at the very least, are completely unprofessional from someone who is positioning themselves as just that - an SEO professional.

To summarise, your points of view are not the problem, but the tone and language with which they are presented and I sincerely hope you will see fit to moderate these entries.

C’mon, now you’re getting polemic ;) In this post I’ve admitted to being polemic to bring my point home, and in the very first post on the topic I clearly stated that my intention was not to slander Erol. However, since you’ve agreed to an open discussion of the SEO flaws, I think it’s no longer suitable to call your customers victims, so I’ve changed that. Also, in my previous post I’ll insert a link near “greed” and “fast bucks” pointing to this paragraph, to make it absolutely clear that I did not mean what you insinuate when I wrote:

Ignorance is no excuse […] Well, it seems to me that Erol prefers a few fast bucks over satisfied customers, thus I fear they will not tell their customers the truth. Actually, they simply don’t get it. However, I don’t care whether their intention to prevaricate is greed or ignorance, I really don’t know, but all the store operators suffering from Google’s penalties deserve the information.

Actually, I still stand by my provoking comments, because at that time they perfectly described the impression you’d created with your actions, or rather your lack of fitting activity, in public.

  1. Critical customers asking in your support forums whether the loss of Google traffic might be caused by the way your software handles HTML output were put down and censored.
  2. Your public answers to worried customers were plain wrong, SEO-wise. Instead of “we take your hints seriously and will examine whether JavaScript redirects may cause Google penalties or not”, you said that search engines index cloaking pages just fine, that Googlebot crawling penalized sites is a good sign, and that all the mess is kinda Google hiccup. At this point the truth had been out long enough, so your most probably unintended disinformation worried a number of your customers, and gave folks like me the impression that you’re not willing to undertake the necessary steps.
  3. Offering SEO services yourself, as well as forum talk praising Erol’s SEO experts, doesn’t put you in a “we just make great shopping cart software and are not responsible for search engine weaknesses” position. Frankly, that’s not conceivable as responsible management of customer expectations. It’s great that your next version will dump frames and JavaScript redirects, but that’s a bit too late in the eyes of your customers, and way too late from an SEO perspective, because Google never permitted the use of JavaScript redirects, and all the disadvantages of frames have been public knowledge since the glory days of AltaVista, Excite and Infoseek, long before Google took over search.

To set the record straight: I don’t think, and never thought, that you’ve greedily or deliberately put your customers at risk in pursuit of monetary gain. You’ve just ignored Google’s guidelines and best practices of Web development for too long, but, as the sub-title of my previous post hints, ignorance is no excuse.

Now that we’ve handled the public relations stuff, I’ll look into the remaining information Erol sent over, hoping that I’ll be able to provide some reasonable input in the best interest of Erol’s customers.


Beware of the narrow-minded coders

or Ignorance is no excuse

A long-winded story on SEO-ignorant pommy coders putting their customers at risk. Hop away if e-commerce-software-vs.-SEO dramas don’t thrill you.

Recently I answered a “Why did Google deindex my pages” question in Google’s Webmaster Forum. It turned out that the underlying shopping cart software (EROL) maintained somewhat static pages as spider fodder, which redirect human visitors to another URL serving the same contents client-sided. A silly thing to do, but pretty common for shopping carts. I used the case as an example of a nice shopping cart coming with destructive SEO in a post on flawed shopping carts in general.

Day by day, other site owners operating Erol-driven online shops popped up in the Google Groups or emailed me directly, so I realized there is a darn widespread problem: a very popular UK-based shopping cart software responsible for Google cloaking penalties. From my contacts I knew that Erol’s software engineers and self-appointed SEO experts believe in weird SEO theories and don’t consider that their software architecture itself could be the cause of the mess. So I wrote a follow-up addressing Erol directly. “Google penalizes Erol-driven e-commerce sites” explains Google’s take on cloaking and sneaky JavaScript redirects to Erol and its customers.

My initial post got linked and discussed in Erol’s support forum and kept my blog stats counter busy over the weekend. Accused of posting crap, I showed up and posted a short summary over there:

Howdy, I’m the author of the blog post you’re discussing here: Why eCommerce systems suck

As for crap or not crap, judge yourself. This blog post was addressed to ecommerce systems in general. Erol was mentioned as an example of a nice shopping cart coming with destructive SEO. To avoid more misunderstandings and to stress the issues Google has with Erol’s JavaScript redirects, I’ve posted a follow-up: Google deindexing Erol-driven ecommerce sites.

This post contains related quotes from Matt Cutts, head of Google’s web spam team, and Google’s quality guidelines. I guess that piece should bring my point home:

If you’re keen on search engine traffic then do not deliver one page to the crawlers and another page to users. Redirecting to another URL which serves the same contents client sided gives Google an idea of intent, but honest intent is not a permission to cloak. Google says JS redirects are against the guidelines, so don’t cloak. It’s that simple.

If you’ve questions, post a comment on my blog or drop me a line. Thanks for listening


Next, the links to this blog were edited out and Erol posted a longish but pointless charade. Click the link to read it in full; in summary, it tells the worried Erol victims that Google has no clue at all, that frames and JS redirects are great for online shops, and that waiting for the next software release providing meaningful URLs will fix everything. Ok, that’s polemic, so here are at least a few quotes:

[…] A number of people have been asking for a little reassurance on the fact that EROL’s x.html pages are getting listed by Google. Below is a list of keyword phrases, with the number of competing pages and the x.html page that gets listed [4 examples provided].
EROL does use frames to display the store in the browser, however all the individual pages generated and uploaded by EROL are static HTML pages (x.html pages) that can be optimised for search engines. These pages are spidered and indexed by the search engines. Each of these x.html pages have a redirect that loads the page into the store frameset automatically when the page is requested.
EROL is a JavaScript shopping cart, however all the links within the store (links to other EROL pages) that are added using EROL Link Items are written to the static HTML pages as standard <a href=""> links - not JavaScript links. This helps the search engines spider other pages in your store.

The ’sneaky re-directs’ being discussed most likely relate to an older SEO technique used by some companies to auto-forward from an SEO-optimised page/URL to the actual URL the site-owner wants you to see.

EROL doesn’t do this - EROL’s page load actually works more like an include than the redirect mentioned above. In its raw form, the ‘x123.html’ page carries visible content, readable by the search engines. In it’s rendered form, the page loads the same content but the JavaScript rewrites the rendered page to include page and product layout attributes and to load the frameset. You are never redirected to another html page or URL. [Not true, the JS function displayPage() changes the location of all pages indexed by Google, and property names like ‘hidepage’ speak for themselves. Example: x999.html redirects to erol.html#999×0&&]
We have, for the past 6 months, been working with search engine optimisation experts to help update the code that EROL writes to the web page, making it even more search engine friendly.

As part of the recommendations suggested by the SEO experts, pages names will become more search engine friendly, moving way from page names such as ‘x123.hml’ to ‘my-product-page-123.html’. […]

Still in friendly and helpful mood I wrote a reply:

With all respect, if I understand your post correctly that’s not going to solve the problem.

As long as a crawlable URL like x999.html resolves to erol.html#999×0&& or whatever, that’s a violation of Google’s quality guidelines. Whether you call that redirect sneaky (Google’s language) or not is not the point. It’s Google’s search engine, so their rules apply. These rules state clearly that pages which do a JS redirect to another URL (on the same server or not, delivering the same contents or not) do not get indexed, or, if discovered later on, get deindexed.

The fact that many x-pages are still indexed and may even rank for their targeted keywords means nothing. Google cannot discover and delist all pages utilizing a particular disliked technique overnight, and never has. Sometimes that’s a process lasting months or even years.

The problem is that these redirects put your customers at risk. Again, Google didn’t change its Webmaster guidelines, which have forbidden JS redirects since the stone age; it has recently changed its ability to discover violations in the search index. Google frequently improves its algos, so please don’t expect to get away with it. Quite the opposite: expect each and every page with these redirects to vanish over the years.

A good approach to avoiding Google’s cloaking penalties is utilizing one single URL as spider fodder as well as content presentation to browsers. When a Googler loads such a page with a browser and compares the URL to the spidered one, you get away with nearly everything CSS and JS can accomplish, as long as the URLs are identical. If OTOH the JS code changes the location, you’re toast.
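The location-changing JavaScript described above can be flagged with a naive pattern scan. This is a rough sketch, not a real cloaking detector: the displayPage pattern comes from the Erol example earlier in this post, and html_sample is invented demo markup:

```python
# Flag HTML whose inline JavaScript changes the location, i.e. the
# redirect pattern that gets pages deindexed. Real detection would
# need to render the page; a regex scan only catches the obvious.
import re

REDIRECT_PATTERNS = (
    r"location\s*\.\s*(replace|assign)\s*\(",      # location.replace(...)
    r"(window|document|top|self)\s*\.\s*location\s*=",  # top.location = ...
    r"displayPage\s*\(",                            # Erol's redirect helper
)

def looks_like_js_redirect(html):
    return any(re.search(p, html) for p in REDIRECT_PATTERNS)

html_sample = '<html><script>top.location = "erol.html#999";</script></html>'
print(looks_like_js_redirect(html_sample))  # -> True
```

If the spidered URL and the URL the browser ends up on differ, no amount of clean markup elsewhere saves the page.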

Posting this response failed, because Erol’s forum admin banned me after censoring my previous post. By the way, according to posts outside their sphere, and from what I’ve seen watching the discussion, they censor customers’ posts too. Well, that’s fine with me, since it’s Erol’s forum and they make the rules. However, still eager to help, I emailed my reply to Erol, and to the Erol customers asking for my take on Erol’s final statement.

You ask why I post this long-winded stuff? Well, it seems to me that Erol prefers a few fast bucks over satisfied customers, thus I fear they will not tell their customers the truth. Actually, they simply don’t get it. However, I don’t care whether their intention to prevaricate is greed or ignorance, I really don’t know, but all the store operators suffering from Google’s penalties deserve the information. A few of them have subscribed to my feed, so I hope my message gets spread. Continuation


Good Bye Nofollow: How to DOfollow comments with blogger

Andy Beard pointed me to a neat procedure to DOFOLLOW links in blog comments with Remove Nofollow Attribute on Blog Comments:

Edit the template’s HTML and remove “rel=’nofollow’” in this line:
<a expr:href='data:comment.authorUrl' rel='nofollow'><data:comment.authorName/></a>

Now I’ve a good reason to upgrade the software. Sadly, I’ve hacked the template so badly that I doubt it will work with the new version :(

