Google crawl errors

I'm seeing a lot of crawl errors from Google, when it tries to get into places like this:
http://creditreporthowto.us/updates/concrete5.4.2.1/concrete/themes...

I'm not sure where it's finding an entry point into http://creditreporthowto.us/updates,... but I think having the robots.txt file exclude the /updates directory will address the problem.

It would be great if the robots.txt file generated by the base product could include this.

 
TheRealSean replied:
Not too sure about the current version, but I know it's in the alpha version's robots.txt:

Disallow: /updates
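For reference, a minimal robots.txt carrying that rule might look like the following; the User-agent line is an assumption (the post only shows the Disallow directive), and the file would sit at the site root:

```
# Apply the rules below to all crawlers
User-agent: *
# Keep crawlers out of the concrete5 updates directory
Disallow: /updates
```

Note that robots.txt path matching is prefix-based, so this also covers nested paths like /updates/concrete5.4.2.1/.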
