Google Webmasters Sitemap submission says all pages blocked by robots.txt but tests as "Allowed" when testing...
http://www.chicodesigncenter.com
Sitemap submission gives 18 warnings (the site only has 18 pages) and says: "Warnings: URL blocked by robots.txt. Sitemap contains URLs which are blocked by robots.txt." The robots.txt file tests fine for all pages and shows ZERO errors. Has anyone else encountered this?
This is all that is in the file, and none of it blocks pages:
User-agent: *
Disallow: /application/attributes
Disallow: /application/authentication
Disallow: /application/bootstrap
Disallow: /application/config
Disallow: /application/controllers
Disallow: /application/elements
Disallow: /application/helpers
Disallow: /application/jobs
Disallow: /application/languages
Disallow: /application/mail
Disallow: /application/models
Disallow: /application/page_types
Disallow: /application/single_pages
Disallow: /application/tools
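For what it's worth, these rules can be sanity-checked locally with Python's standard urllib.robotparser. A minimal sketch, assuming the rules above; the two page URLs are hypothetical examples, not pages confirmed to be in the sitemap:

import urllib.robotparser

# Feed the parser the same rules shown above
# (shortened here to a couple of the Disallow lines).
rules = """\
User-agent: *
Disallow: /application/attributes
Disallow: /application/config
""".splitlines()

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

# An ordinary page should be allowed, since only /application/* paths are disallowed.
print(rp.can_fetch("*", "http://www.chicodesigncenter.com/about"))  # expect: True

# A path under a disallowed prefix should be blocked.
print(rp.can_fetch("*", "http://www.chicodesigncenter.com/application/config/site.php"))  # expect: False

If the sitemap URLs all come back True here, the rules themselves aren't blocking them, which matches what the robots.txt Tester reports.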