Save bandwidth costs: Dynamic pages can support If-Modified-Since too

Conditional HTTP GET requests make Webmasters and Crawlers happy

When search engine crawlers burn way too much of your bandwidth, this post is for you. Crawlers sent out by the major search engines (Google, Yahoo and MSN/Live Search) support conditional GETs; that means they don’t fetch your pages if those haven’t changed since the last crawl.

Of course they must fetch your stuff over and over again for this comparison if your Web server doesn’t play nice with Web robots, as well as with other user agents that can cache your pages and other Web objects like images. The protocol your Web server and the requestors use to handle caching is quite simple, but its implementation can become tricky. Here is how it works:

1st request Feb/10/2008 12:00:00

Googlebot requests /some-page.php from your server. Since Google has just discovered your page, there are no unusual request headers, just a plain GET.

You create the page from a database record which was modified on Feb/09/2008 10:00:00. Your server sends Googlebot the full page (5k) with an HTTP header
Date: Sun, 10 Feb 2008 12:00:00 GMT
Last-Modified: Sat, 09 Feb 2008 10:00:00 GMT

(let’s assume your server is located in Greenwich, UK); the HTTP response code is 200 (OK).

Bandwidth used: 5 kilobytes for the page contents plus less than 500 bytes for the HTTP header.

2nd request Feb/17/2008 12:00:00

Googlebot found interesting links pointing to your page, so it requests /some-page.php again to check for updates. Since Google already knows the resource, Googlebot requests it with an additional HTTP header
If-Modified-Since: Sat, 09 Feb 2008 10:00:00 GMT

where the date and time is taken from the Last-Modified header you’ve sent in your response to the previous request.

You didn’t change the page’s record in the database, hence there’s no need to send the full page again. Your Web server sends Googlebot just an HTTP header
Date: Sun, 17 Feb 2008 12:00:00 GMT
Last-Modified: Sat, 09 Feb 2008 10:00:00 GMT

The HTTP response code is 304 (Not Modified). (Your Web server can suppress the Last-Modified header, because the requestor has this timestamp already.)

Bandwidth used: Less than 500 bytes for the HTTP header.

3rd request Feb/24/2008 12:00:00

Googlebot can’t resist recrawling /some-page.php, again using the
If-Modified-Since: Sat, 09 Feb 2008 10:00:00 GMT

header.

You’ve updated the database on Feb/23/2008 09:00:00, adding a few paragraphs to the article, thus you send Googlebot the full page (now 7k) with this HTTP header
Date: Sun, 24 Feb 2008 12:00:00 GMT
Last-Modified: Sat, 23 Feb 2008 09:00:00 GMT

and an HTTP response code 200 (OK).

Bandwidth used: 7 kilobytes for the page contents plus less than 500 bytes for the HTTP header.

Further requests

Provided you don’t change the contents again, all further chats between Googlebot and your Web server regarding /some-page.php will burn less than 500 bytes of your bandwidth each. Say Googlebot requests this page weekly; that’s roughly 370k of bandwidth saved annually. You do the math. Even with a medium-sized Web site you most likely want to implement proper caching, right?
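That back-of-the-envelope math can be sketched in a few lines (the ~500-byte 304 responses still cost a little, so the figure comes out slightly more conservative):

```php
<?php
// Yearly savings for one weekly-crawled page, using the numbers from
// the example above.
$fullFetchBytes = 7 * 1024;   // page contents per unconditional fetch
$headerBytes    = 500;        // approximate cost of a 304 response
$crawlsPerYear  = 52;

$savedBytes = $crawlsPerYear * ($fullFetchBytes - $headerBytes);
echo round($savedBytes / 1024) . "k\n";  // roughly 339k per page per year
```

Multiply by the number of pages on your site and the crawl frequency of each, and the savings add up quickly.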

Not only Webmasters love conditional GET requests that save bandwidth costs and processing time; search engines aren’t keen on useless data transfers either. So let’s see how you could respond efficiently to conditional GET requests from search engines. Apache handles caching of static files (e.g. .txt or .html files you upload with FTP) differently from dynamic contents (script outputs, with or without a query string in the URI).

Static files

Fortunately, Apache comes with native support for the Last-Modified / If-Modified-Since / Not-Modified functionality. That means that crawlers and your Web server don’t produce too much network traffic when a requested static file didn’t change since the last crawl.

You can test your Web server’s conditional GET support with your robots.txt, or, if even your robots.txt is a script, create a tiny HTML page with a text editor and upload it via FTP. Another neat tool to check HTTP headers is the Live HTTP Headers extension for Firefox (bear in mind that testing crawler behavior with Web browsers is fault-prone by design).

If your second request of an unchanged static file results in a 200 HTTP response code instead of a 304, call your hosting service. If it works and you have only static pages, then bookmark this article and move on.

Dynamic contents

Everything you output with server-side scripts is dynamic content by definition, regardless of whether the URI has a query string or not. Even if all your PHP script does is read and print out a static file that never changes, Apache doesn’t add the Last-Modified header, which forces crawlers to perform further requests with an If-Modified-Since header.

With dynamic content you can’t rely on Apache’s caching support; you must handle it yourself.

The first step is figuring out where your CMS or eCommerce software hides the timestamps telling you the date and time of a page’s last modification. Usually a script pulls its stuff from different database tables, hence a page contains more than one area, or block, of dynamic contents. Every block might have a different last-modified timestamp, but not every block is important enough to determine the page’s last-modified date. The same goes for templates. Most template tweaks shouldn’t trigger a full-blown recrawl, but some do, for example a new address or phone number if such information is present on every page.

For example a blog has posts, pages, comments, categories and other data sources that can change the sidebar’s contents quite frequently. On a page that outputs a single post or page, the last-modified date is determined by the post or, if newer, its last comment. The main page’s last-modified date is the modified-timestamp of the most recent post, and the same goes for its paginated continuations. A category page’s last-modified date is determined by the category’s most recent post, and so on.
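This rule of thumb boils down to a max() over the candidate timestamps; a sketch with made-up example dates:

```php
<?php
// A page's effective last-modified date is the newest timestamp of all
// the blocks (post, comments, sidebar sources, ...) that matter for crawling.
function pageLastModified(array $blockTimestamps) {
    return max($blockTimestamps);
}

// Example: a post edited on Feb 9, with a comment added on Feb 12.
date_default_timezone_set('UTC');
$postModified    = strtotime('2008-02-09 10:00:00');
$lastCommentDate = strtotime('2008-02-12 18:30:00');
$pageTimestamp   = pageLastModified(array($postModified, $lastCommentDate));
```

Feed it only the timestamps of blocks that are important enough to trigger a recrawl, and ignore the rest.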

New posts can change outgoing links of older posts when you use plugins that list related posts and stuff like that. There are many more reasons why search engines should crawl older posts at least monthly or so. You might need a routine that changes a blog page’s last-modified timestamp, for example when it is more than 30 days or so in the past. Also, in some cases it could make sense to have a routine that resets all timestamps reported as last-modified dates for particular site areas, or even the whole site.

If your software doesn’t populate last-modified attributes on changes of all entities, then consider database triggers, stored procedures, or changes to your data access layer. Bear in mind that not every change of a record must trigger a crawler cache reset. For example, a table storing textual contents like articles or product descriptions usually has a number of attributes that don’t affect crawling, so it should have a last updated attribute that’s changeable in the UI and serves as the last-modified date in your crawler cache control (instead of the timestamp that’s changed automatically even on minor updates of attributes which are meaningless for HTML outputs).

Handling Last-Modified, If-Modified-Since, and Not-Modified HTTP headers with PHP/Apache

Below I provide example PHP code I’ve thrown together after midnight in a sleepless night, doped with painkillers. It hasn’t run on a production system, but it should get you started. Adapt it to your needs and make sure you test your stuff intensively. As always, my stuff comes as is, without any guarantees. ;)

First grab a couple of helpers and put them in an include file that’s available to all your scripts. Since we deal with HTTP headers, you must not output anything before the logic that deals with conditional search engine requests: not even a single white space character or HTML DOCTYPE declaration.

In general, all user agents should support conditional GET requests, not only search engine crawlers. If you allow long-lasting caching, which is fine with search engines that don’t need to crawl your latest Twitter message from your blog’s sidebar, you could leave your visitors with somewhat outdated pages if you serve them 304-Not-Modified responses too.

It might be a good idea to limit 304 responses to conditional GET requests from crawlers when you don’t implement much shorter caching cycles for other user agents. The latter include folks that spoof their user agent name, as well as scrapers trying to steal your stuff masked as a legit spider. To verify legit search engine crawlers that (should) support conditional GET requests (from Google, Yahoo, MSN and Ask) you can grab my crawler detection routines here. Include them as well; then you can code stuff like this:

$isSpiderUA = checkCrawlerUA();
$isLegitSpider = checkCrawlerIP(__FILE__);
if ($isSpiderUA && !$isLegitSpider) {
    // make sure your 403-Forbidden ErrorDocument directive in
    // .htaccess points to a page that explains the issue!
    @header("Thou shalt not spoof", TRUE, 403);
    exit;
}
if ($isLegitSpider) {
    // insert your code dealing with conditional GET requests
}

Now that you’re sure that the requestor is a legit crawler from a major search engine, look at the HTTP request header it has submitted to your Web server.

// look up the HTTP request header for a possible conditional GET
$ifModifiedSinceTimestamp = getIfModifiedSince();
// if the request is not conditional, don't send a 304
$canSend304 = FALSE;
if ($ifModifiedSinceTimestamp !== FALSE) {
    $canSend304 = TRUE;
    // tell the requestor that you've recognized the conditional GET
    $echoRequestHeader = "X-Requested-If-modified-since: "
        . unixTimestamp2HttpDate($ifModifiedSinceTimestamp);
    @header($echoRequestHeader, TRUE);
}

You don’t need to echo the If-Modified-Since HTTP-date in the response header, but this custom header makes testing easier.

Next get the page’s actual last-modified date/time. Here is an (incomplete) code sample for a WordPress single post page.

// select the requested post's comment_count, post_modified and
// post_date values, then:
if ($wp_post_modified) {
    $lastModified = date2UnixTimestamp($wp_post_modified);
}
else {
    $lastModified = date2UnixTimestamp($wp_post_date);
}
if (intval($wp_comment_count) > 0) {
    // select the last comment from the WordPress database, then:
    $lastCommentTimestamp = date2UnixTimestamp($wp_comment_date);
    if ($lastCommentTimestamp > $lastModified) {
        $lastModified = $lastCommentTimestamp;
    }
}

The date2UnixTimestamp() function accepts MySQL datetime values as valid input. If you need to (re)write last-modified dates to a MySQL database, convert the Unix timestamps to MySQL datetime values with unixTimestamp2MySqlDatetime().
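Assuming both conversions are thin wrappers over strtotime() and date(), a round trip between the two formats looks like this (keep PHP’s default timezone in sync with your database’s):

```php
<?php
date_default_timezone_set('UTC');  // keep PHP and MySQL on the same clock

$mysqlDatetime = '2008-02-23 09:00:00';
$unixTimestamp = strtotime($mysqlDatetime);            // MySQL DATETIME -> Unix
$roundTripped  = date('Y-m-d H:i:s', $unixTimestamp);  // ... and back again
```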

Your server’s clock isn’t necessarily synchronized with all search engines out there. To cover possible gaps you can report a last-modified timestamp that’s a little fresher than the actual last-modified date. In this example the timestamp reported to the crawler is last-modified + 13 hours; you can change the offset in makeLastModifiedTimestamp().
$lastModifiedTimestamp = makeLastModifiedTimestamp($lastModified);

If you compare the timestamps later on, and the request isn’t conditional, don’t run into the 304 routine.
if ($ifModifiedSinceTimestamp === FALSE) {
    // make things equal if the request isn't conditional
    $ifModifiedSinceTimestamp = $lastModifiedTimestamp;
}

You may want to allow a full fetch if the requestor’s timestamp is ancient, in this example older than one month.
$tooOld = @strtotime("now") - (31 * 24 * 60 * 60);
if ($ifModifiedSinceTimestamp < $tooOld) {
    $lastModifiedTimestamp = @strtotime("now");
    $ifModifiedSinceTimestamp = @strtotime("now") - (1 * 24 * 60 * 60);
}

Setting the last-modified attribute to yesterday schedules the next full crawl after this fetch in 30 days (or later, depending on the actual crawl frequency).

Finally, respond with 304-Not-Modified if the page hasn’t changed remarkably since the date/time given in the crawler’s If-Modified-Since header. Otherwise send a Last-Modified header with a 200 HTTP response code, allowing the crawler to fetch the page contents.
$lastModifiedHeader = "Last-Modified: "
    . unixTimestamp2HttpDate($lastModifiedTimestamp);
// note the <=: with a strict < an unchanged page (equal timestamps)
// would get a full 200 response on every recrawl
if ($lastModifiedTimestamp <= $ifModifiedSinceTimestamp &&
    $canSend304) {
    @header($lastModifiedHeader, TRUE, 304);
    exit;
}
else {
    @header($lastModifiedHeader, TRUE);
}

When you’re testing your version of this script with a browser, it will send a standard HTTP request, and your server will return a 200-OK. From your server’s response your browser should recognize the Last-Modified header, so when you reload the page the browser should send an If-Modified-Since header, and you should get the 304 response code if Last-Modified isn’t newer than If-Modified-Since. However, judging from my experience, such browser-based tests of crawler behavior, or of responses to crawler requests, aren’t reliable.

Test it with this MS tool instead. I’ve played with it for a while and it works great. With the PHP code above I’ve created a 200/304 test page
http://sebastians-pamphlets.com/tools/last-modified-yesterday.php
that sends a “Last-Modified: yesterday” response header and should return a 304-Not Modified HTTP response code when you request it with an “If-Modified-Since: today (or later)” header; otherwise it should respond with 200-OK (this version returns 200-OK only, but tells you when it would respond with a 304). You can use this URI with the MS tool linked above to test HTTP requests with different If-Modified-Since headers.
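The decision such a test page makes boils down to a single comparison. Here’s a self-contained sketch (an assumed reconstruction, not the actual script behind the URI above):

```php
<?php
// Decide the response code: 304 when the client already has a copy at
// least as fresh as ours, 200 (full contents) otherwise.
function responseCodeFor($lastModified, $ifModifiedSince) {
    if ($ifModifiedSince !== FALSE && $lastModified <= $ifModifiedSince) {
        return 304;
    }
    return 200;
}

// "Last-Modified: yesterday" test page: report yesterday as the
// modification time, then answer based on the request header.
$lastModified = time() - 24 * 60 * 60;
$ifModifiedSince = isset($_SERVER['HTTP_IF_MODIFIED_SINCE'])
    ? strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE'])
    : FALSE;
$code = responseCodeFor($lastModified, $ifModifiedSince);
header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $lastModified) . ' GMT',
    TRUE, $code);
if ($code === 304) {
    exit;
}
echo 'Full page contents';
```

Request it without an If-Modified-Since header (or with yesterday’s date) and you get a 200; request it with today’s date and you get a 304.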

Have fun and paypal me 50% of your savings. ;)




12 Comments to "Save bandwidth costs: Dynamic pages can support If-Modified-Since too"

  1. Sebastian on 19 February, 2008  #link

    How could I miss out on these links from John Müller …

    JohnMu: @SebastianX check out http://oyoy.eu/huh/304-php/, http://oyoy.eu/huh/304-classic-asp/ and http://oyoy.eu/page/ifmodified/ (old stuff :-))

    Such oldies are goodies ;)

  2. Colin Cochrane on 19 February, 2008  #link

    Great post Sebastian. This is an invaluable resource for anyone who is serious about maximizing the performance of their server.

  3. Icheb on 19 February, 2008  #link

    When will people finally realize that bandwidth != traffic?

    BANDWIDTH is the SPEED something is transferred at.

    TRAFFIC is the AMOUNT of data that has been transferred.

    So when you say shit like “Bandwidth used: 5 kilobytes”, you really mean TRAFFIC.

  4. Sebastian on 20 February, 2008  #link

    Thanks Colin. :)
    Icheb, of course that’s traffic, but everybody uses the other term so I do it too for compatibility reasons. Also, some hosts charge bandwidth, not traffic/gigs transferred.

  5. SearchCap: The Day In Search, February 21, 2008…

    Below is what happened in search today, as reported on Search Engine Land and from other places across the web…….

  6. Utah SEO Pro on 23 February, 2008  #link

    Leave it to Sebastian to bust out some hella complex code to solve problems I’ve never even would have though of. Kick ass!

  7. Sebastian on 24 February, 2008  #link

    Thanks Jordan. Actually, the code isn’t complex.

  8. […] Save bandwidth costs: Dynamic pages can support If-Modified-Since too […]

  9. Craig on 5 March, 2008  #link

    “Bandwidth” has a number of different definitions depending on the context.

    It’s original definition had nothing to do with traffic or bytes at all but instead, originated in the context of broadcasting, i.e. the amount of spectrum available to each communications licensee.

    Since then and in the context of Internet traffic, bandwidth can have a similar meaning, an available range or it can mean a range actually used or it can even mean the amount of data transfered over a given range.

    Very few words in the English language have one and only one meaning so context is everything. ;-)
    Craig

  10. angelvoyagera on 29 June, 2009  #link

    it dont work email me a notice

  11. […] more than HTTP response codes doable with creative headers. For example cost savings. In some cases a single line of text in an HTTP header tells the user agent more than a bunch of […]

  12. Austin on 23 October, 2009  #link

    Pure genius - I’ve cut my “bandwidth” usage in half using this. Time to implement on the rest of my sites!
