Even on domains where I want everything indexed, I still include a blank robots.txt. Why? Because I hate having my error logs cluttered up with unnecessary 404 entries from every crawler that asks for the file and doesn't find one.
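A truly empty file does the job, but if you'd rather it not be zero bytes, the explicit allow-everything version is just:

    User-agent: *
    Disallow:

An empty Disallow line means nothing is off limits.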
It's also fun to use robots.txt to watch for malicious spiders. If you disallow a directory that nobody has any legitimate reason to access (and that isn't linked from anywhere), you can be sure that any request hitting that directory is up to no good: the only way to find it is to read robots.txt and then ignore the rule.
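As a sketch (the trap directory and log path here are made up, so adjust them for your own setup), the robots.txt entry would look like:

    User-agent: *
    Disallow: /secret-stuff/

Then keep an eye on your access log for anything requesting that path. A few lines of Python will pull the offenders out of a standard combined-format log:

    # watch_trap.py -- flag requests that hit the robots.txt honeypot directory
    TRAP_PATH = "/secret-stuff/"              # the disallowed trap directory (hypothetical)
    LOG_FILE = "/var/log/apache2/access.log"  # adjust to wherever your server logs

    with open(LOG_FILE) as log:
        for line in log:
            if TRAP_PATH in line:
                # in combined log format the client IP is the first field
                ip = line.split()[0]
                print("suspicious request from %s: %s" % (ip, line.strip()))

A plain grep for the trap path does the same job; the script just saves you squinting at the whole log.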
__________________
"I know you believe you understand what you think I said, but I am not sure you realize that what you heard is not what I meant."
--Richard Nixon