I'm trying to run a link checker before going live next week, but I keep getting an "Access Denied by Robots.txt" error. I had set up my robots.txt file to block everything while the site was under construction, and I just recently changed it back to the defaults:
Sitemap: http://garlicfestival.3dcartstores.com/sitemap.xml
# Disallow all crawlers access to certain pages.
User-agent: *
Disallow: /checkout.asp
Disallow: /add_cart.asp
Disallow: /view_cart.asp
Disallow: /error.asp
Disallow: /shipquote.asp
Disallow: /rssfeed.asp
Disallow: /mobile/
Does anyone know why it's still being blocked? Does it take a while for a new robots.txt to take effect?
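One way to rule out a stale cache on the link checker's side is to test the rules yourself with Python's built-in `urllib.robotparser`. This is just a sketch: it feeds the rules posted above directly to the parser so it runs offline; in practice you'd call `read()` against the live `http://garlicfestival.3dcartstores.com/robots.txt` instead. If the parser says a page is allowed but the checker still reports "Access Denied by Robots.txt", the checker is most likely working from a cached copy of the old file.

```python
from urllib.robotparser import RobotFileParser

# The rules from the robots.txt above (the Sitemap line is ignored by
# the parser, so it's omitted here).
rules = """\
User-agent: *
Disallow: /checkout.asp
Disallow: /add_cart.asp
Disallow: /view_cart.asp
Disallow: /error.asp
Disallow: /shipquote.asp
Disallow: /rssfeed.asp
Disallow: /mobile/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Pages not listed under Disallow should be allowed for any crawler;
# the listed ones should be blocked.
print(parser.can_fetch("*", "/"))              # True
print(parser.can_fetch("*", "/checkout.asp"))  # False
```

If this prints `True` for your normal pages, the new robots.txt is fine and the problem is the checker's cached copy; many crawlers re-fetch robots.txt only every day or so, so either wait or look for a "recheck robots.txt" option in the tool.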