Url blocked by robots.txt.

    jmk909er


    Hi, I installed the “Google XML Sitemaps” plugin and am getting set up in Google Webmaster Tools. When I enter the sitemap and test it, it returns an error saying: “Url blocked by robots.txt. Sitemap contains urls which are blocked by robots.txt.”

    Is this a setting in the Graphene theme that I need to do something with?


    Kenneth John Odle


    No, it’s in WordPress itself.

    Settings >> Reading

    Make sure the box labeled “Discourage search engines from indexing this site” is NOT ticked.
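
    For what it’s worth, when that box IS ticked, WordPress serves a virtual robots.txt that blocks the whole site, which is exactly what produces that Webmaster Tools error. It typically looks like this (a sketch; the exact output can vary by WordPress version):

    User-agent: *
    Disallow: /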



    Why do you want a robots.txt?

    This is working with Google:

    User-agent: *
    Allow: /



    Hi, I did have “Discourage search engines from indexing this site” ticked while I was building my site but I un-ticked it before I set up anything with Google. I just checked it again and it is still unchecked. I don’t really know what else to do. Do you have any suggestions?


    Kenneth John Odle


    FTP to your site, download the robots.txt file, and make sure it was actually changed. If not, you’ll have to edit it manually and reupload it.
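
    (The robots.txt file sits in your site’s root folder, the same directory that holds wp-config.php.) If you do edit it by hand, a minimal permissive version looks something like this; treat “example.com” as a placeholder for your own domain and sitemap URL:

    User-agent: *
    Disallow:
    Sitemap: http://example.com/sitemap.xml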

    It could also be caught in the server cache. If you are using one, empty it so that Webmaster Tools is served a fresh copy of the file.
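
    A quick way to check what is actually being served, assuming you can run curl from a command line (again, swap in your own domain):

    curl http://example.com/robots.txt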

    If nothing else, just FTP and delete it.



    Thanks Kenneth, but you’re talking way over my head; I’m not a programmer. Where is this robots.txt file located? Is it in the editor?

    I also have FTP, but where would I look for it? What folder is it in?



    Hey Kenneth, I just checked Google Webmaster again and the error has cleared! I didn’t do anything, so I guess it just took time for the cache to clear? Thanks for your help. This is resolved.
