I run a moderate-sized gaming community. However, some people are using Googlebot to attack my website with 2hrmedical and things of that nature. Is there a way to block Googlebot from downloading page information without blocking it from being able to scan my website? That way they can't use Googlebot as a proxy, and I can figure out what country they are in and ban that subnet. -question by RaxinII
Googlebot is not an attack bot. Googlebot is a crawler that visits your site and later adds its content to the Google index. Also, .htaccess can't do much here: you can only block IP ranges with it, and Googlebot uses many IPs. You could block Googlebot with robots.txt (you can google how), but there is no reason to do that. If you block Googlebot, Google will stop crawling your site, and when somebody searches, they will not find your website. People might attack you with DoS or DDoS attacks ( http://en.wikipedia.org/wiki/DDoS ), which request your site too many times and can eat up your bandwidth and slow down your website. But hosts have options and features to block these attacks, so you shouldn't worry. So I don't believe somebody is actually attacking your site through Googlebot.
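If the real worry is attackers spoofing the Googlebot user agent, Google's documented way to tell a genuine Googlebot from an impostor is a reverse-DNS lookup on the requesting IP (the hostname should end in googlebot.com or google.com) followed by a forward lookup that must resolve back to the same IP. A minimal sketch in Python; the function names are my own, not from any library:

```python
import socket

# Hostnames Google's crawlers reverse-resolve to.
GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def hostname_is_google(hostname):
    """True if a reverse-DNS hostname belongs to Google's crawler domains."""
    return hostname.rstrip(".").endswith(GOOGLE_SUFFIXES)

def is_real_googlebot(ip):
    """Reverse-DNS plus forward-confirm check (requires network access)."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)  # IP -> hostname
    except socket.herror:
        return False  # no PTR record: not Google
    if not hostname_is_google(hostname):
        return False  # spoofed user agent from a non-Google host
    try:
        _, _, addrs = socket.gethostbyname_ex(hostname)  # hostname -> IPs
    except socket.gaierror:
        return False
    return ip in addrs  # forward lookup must match the original IP
```

A client that merely sends "Googlebot" in its User-Agent header fails the reverse-DNS step, so you can safely ban its IP without touching the real crawler.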
You could redirect the bots somewhere else or deny them access.

Create a plain text file called robots.txt and upload it to your root directory. Add these contents:

User-agent: *
Disallow: /

This will block everything from everything. If you want to block only certain files or folders:

User-agent: *
Disallow: /admin/
Disallow: /db/
Disallow: /admin.php
Disallow: /config.php
Disallow: /mainfile.php
Disallow: /administration/

This should prevent well-behaved bots from accessing those files (note that robots.txt is only a request, so malicious bots can ignore it). If this is not what you were looking for, as the poster above advised, then I suggest you simply get some extra protection for your site. Visit http://www.crawltrack.net/ for more information on advanced protection.
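Before uploading rules like these, you can check what they actually allow and deny using Python's standard urllib.robotparser; a small sketch (example.com and the paths are just placeholders):

```python
from urllib import robotparser

# A subset of the rules from the post, fed to the parser line by line.
rules = [
    "User-agent: *",
    "Disallow: /admin/",
    "Disallow: /config.php",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Every user agent, Googlebot included, is asked to stay out of /admin/.
print(rp.can_fetch("Googlebot", "http://example.com/admin/users.php"))   # False
print(rp.can_fetch("Googlebot", "http://example.com/forum/index.php"))   # True
```

This makes it easy to confirm a new Disallow line does what you intend before the real crawlers see it.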