This is a quick step-by-step tutorial. I will describe how to identify and get rid of web spiders/crawlers. What a bot is, what it does, how it functions, and so on can all be found here.

So you are having trouble with robots; good or bad does not matter. They all leech away your bandwidth and resources and only maybe do something for you in return, even when they are not harvesting or spammer bots. The problem goes beyond bandwidth when you have something like 100,000 dynamic pages under one server. So how do we separate the good bots from the bad?
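As a first cut, you can flag crawlers that identify themselves in the User-Agent header. Here is a minimal sketch; the signature list and function name are illustrative, not exhaustive:

```python
# Illustrative list of substrings found in well-known crawler User-Agents.
KNOWN_BOT_SIGNATURES = ("googlebot", "bingbot", "yandexbot", "baiduspider")

def is_known_bot(user_agent: str) -> bool:
    """Return True if the User-Agent matches a known crawler signature."""
    ua = user_agent.lower()
    return any(sig in ua for sig in KNOWN_BOT_SIGNATURES)

print(is_known_bot("Mozilla/5.0 (compatible; Googlebot/2.1)"))   # True
print(is_known_bot("Mozilla/5.0 (Windows NT 10.0; Win64; x64)")) # False
```

Keep in mind that User-Agent strings are trivially spoofed, so a check like this only catches the polite bots; the misbehaving ones need other tricks, which we get to below.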