An email I received earlier this evening, from SEVENtwentyfour Inc.:

There appears to be a problem on this page of your site.

On page http://www.marteydodoo.com/2005/07/31/free-lgpl-icons/ when you click on "via Kevin Walzer?s now-defunct blog", the link to http://www.kevin-walzer.com/pivot/entry.php?id=71 gives the error: Domain name lookup failed (may be a transient error).

It is extremely obvious that this "problem" was detected by a web-crawling robot, since I clearly noted in my post that the link did not work. I only referenced it because an endless line of teachers in elementary and high school drilled the concepts of proper citation into me.

While complaining about a hyperlink that I clearly already knew was broken was bad enough, the sheer lack of effort put into the endeavor simply heightens my annoyance. Would it have taken too much time to fetch the title of the page instead of pasting in the raw URL? "On page http://www.example.com"? Who writes like that? And "Walzer?s"? It's called Unicode, fool.

As recommended by the Robot Guidelines, this email is to explain our robot's [sic] visit to your site, and to let you know about one of the problems we found. We don't store or publish the content of your pages, but rather use the link information to update our map of the World Wide Web.

While this is a nice attempt at legitimization, it is also a complete fabrication. The Guidelines for Robot Writers provide only one instance in which robot programmers should use email to contact website administrators: when their robot is crawling only a few websites. Since SEVENtwentyfour is attempting to create a "map of the World Wide Web," this clearly does not apply to them.

Interestingly enough, the Guidelines for Robot Writers also suggest that robot programmers include an email address in their user agent field, so that website administrators can contact them in case of problems. SEVENtwentyfour does not do this.
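
For comparison, a robot following that guideline might identify itself with a user agent like this (a made-up example):

ExampleBot/1.0 (+http://www.example.com/bot.html; bot@example.com)

LinkWalker, as my access log below shows, sends nothing but its name.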

Are these reports helpful? I'd love some feedback. If you prefer not to receive these occasional error notices please let me know.

Roy Bryant

If I felt a need for a program to nag me about mistakes it assumed I had made, I could build my own Nagbot program to do so. While I have no problem receiving error messages about actual broken links, I get enough spam each day without also hearing from the owners of shady Internet companies who suck up my bandwidth to hawk their "superior link checking" technology when they would clearly have been better off pursuing spelling and grammar checking.
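
A checker like that would fit in a few lines of shell. Here is a minimal sketch, assuming a hypothetical links.txt file with one URL to test per line:

#!/bin/sh
# nagbot.sh: a minimal broken-link checker (links.txt is hypothetical)
while read url; do
    # curl prints only the HTTP status code; 000 means the request never completed
    code=$(curl -s -o /dev/null --max-time 10 -w "%{http_code}" "$url")
    case "$code" in
        2*|3*) ;;                                # 2xx/3xx responses are fine
        *) echo "broken link: $url (status $code)" ;;
    esac
done < links.txt

Pointing that at my own outbound links once a month would accomplish everything LinkWalker does, minus the sales pitch.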

$ grep seventwentyfour access.log
209.167.50.22 - - [02/Sep/2005:01:38:17 -0700] "GET /robots.txt HTTP/1.1" 200 727 "www.seventwentyfour.com/" "LinkWalker"
$ echo "Deny from 209.167.50.22" >> ~/marteydodoo.com/.htaccess