Search Engines Indexing Core Folders
We have a site being indexed by Google and we noticed it's indexing core folders like:
/concrete/libraries
/files/onstates
/concrete/jobs
/concrete/config
See for yourself:
http://www.google.com/search?hl=en&q=site:gigstadpainting.com...
This is no good. Any ideas on how Concrete could block that indexing rather than me setting up a custom robots file?
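One server-side option that skips robots.txt entirely: if mod_headers is enabled, Apache can attach an X-Robots-Tag header to everything served from the core folders, which tells Google not to index them. A minimal sketch for httpd.conf; the /home/user/public_html path below is a placeholder for your actual document root:

# Requires mod_headers; adjust the placeholder path to your docroot.
<Directory "/home/user/public_html/concrete">
# Tell crawlers not to index or follow anything served from here
Header set X-Robots-Tag "noindex, nofollow"
</Directory>
<Directory "/home/user/public_html/files">
Header set X-Robots-Tag "noindex, nofollow"
</Directory>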
That seemed to work, but then I checked the Pretty URL functionality and it was spitting out 404s.
I added this to httpd.conf
<Directory "/home/*">
Options -Includes
AllowOverride None
</Directory>
So server.com/concrete was blocked, but so was server.com/services (an actual page, prettified by Pretty URLs).
Thoughts?
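If I had to guess at the 404s: AllowOverride None tells Apache to ignore .htaccess files entirely, and Pretty URLs depend on the mod_rewrite rules in the site's .htaccess, so every rewritten URL like /services dies. Options -Includes also only turns off server-side includes, not directory listings. A sketch of what should block listings without breaking the rewrites, assuming the rules live in the document root's .htaccess:

<Directory "/home/*">
# -Indexes disables directory listings (-Includes only disables SSI)
Options -Indexes
# Keep .htaccess working so the Pretty URL rewrite rules still apply
AllowOverride All
</Directory>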
Thanks Andrew, I resolved it by learning more about editing httpd.conf.
Go figure.
A robots.txt file wouldn't hurt either.
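Something like this at the site root would cover the folders from the original post; treat it as a sketch and adjust the paths to match the actual install:

# robots.txt at the document root
User-agent: *
Disallow: /concrete/libraries/
Disallow: /concrete/jobs/
Disallow: /concrete/config/
Disallow: /files/onstates/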
Now, there's nothing an end user should be able to do with that knowledge... but it's still probably not the best server setup (plus, I think this is why you're being indexed).
I'd disable directory browsing through Apache, or have an administrator do it... that way, going to a directory that doesn't have a valid index.php file in it will get a permission denied error, and I think Google will stop spidering those directories then:
http://felipecruz.com/blog_disable-directory-listing-browsing-apach...