Having trouble blocking the Google spider
Posted by lhmx, 11-15-2012, 02:51 AM I'm having some trouble blocking Google (and others) from indexing my site. I created a robots.txt file in the root directory of my site's files with:

User-agent: *
Disallow: /

Yet Google continues to index my pages. Any ideas what is going on?
Posted by Dave Parish, 11-15-2012, 03:12 AM What do the permissions on the file look like?
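For what it's worth, a quick way to check both the file permissions and whether the file is actually being served where crawlers expect it is to request it directly (a rough sketch; example.com and /path/to/docroot are just placeholders for the real domain and document root):

    ls -l /path/to/docroot/robots.txt     # should be world-readable, e.g. -rw-r--r--
    curl -i http://example.com/robots.txt  # should return 200 OK with the Disallow rules in the body

If that request 404s or returns something other than the rules above, Googlebot never sees the Disallow at all.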
Posted by Dr_Michael, 11-15-2012, 04:06 AM Try logging in to Google Webmaster Tools and setting the frequency of Googlebot's visits there. You can configure it to the lowest possible crawl rate, etc.
Posted by zoid, 11-15-2012, 04:13 AM It can take some time for a robots.txt change to take effect.
Posted by JayWard_HSW, 11-15-2012, 01:00 PM Dr_Michael is right; Webmaster Tools is the way to go here. You can have it fetch your site the way the bot does, to make sure the permissions and directives in your robots.txt file are correct. Plus it offers a lot of other invaluable insights into your site.
Posted by lhmx, 11-16-2012, 03:50 PM Figured out a better way by just using meta tags instead. That stopped everything immediately.
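For reference, the meta tag approach means adding a robots meta tag to the <head> of each page. The poster didn't show the exact markup, but the standard form is:

    <meta name="robots" content="noindex, nofollow">

Google drops the page from its index the next time it crawls it and sees the tag, which is why this can appear to work faster than waiting for a robots.txt change to propagate. One caveat: the page has to remain crawlable for the bot to read the tag, so robots.txt must not block it at the same time.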