Optimize your robots.txt in WordPress

Every WordPress site should have a robots.txt file.

That file tells search engine robots which parts of your site they may or may not crawl.

On a WordPress site we want all posts indexed, but we don't want crawlers in the admin section (wp-admin) or the core files (wp-includes).

So, how do you optimize your robots.txt file in WordPress?

1. Create the robots.txt file

Create it in the root folder of your WordPress installation if you don't have it already.
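If you prefer to script this step, here is a minimal sketch. It assumes it runs from your WordPress root folder and writes the rules described in step 2; adjust the path if your setup differs.

```python
from pathlib import Path

# Hypothetical sketch: create robots.txt with basic rules if it's missing.
# Assumes the current working directory is the WordPress root.
robots = Path("robots.txt")
if not robots.exists():
    robots.write_text(
        "User-agent: *\n"
        "Disallow: /wp-includes/\n"
        "Disallow: /wp-admin/\n"
    )
print(robots.read_text())
```

If you manage the site over FTP, creating the file locally and uploading it to the root folder works just as well.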

2. Fill the robots.txt file with these lines

Note that the last line is the URL of your XML sitemap. Create one if you don't have it.

User-agent: *
Disallow: /wp-includes/
Disallow: /wp-admin/

Sitemap: https://ricard.dev/sitemap.xml.gz

These lines let search engines index your whole site while disallowing the wp-includes and wp-admin folders.
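You can verify that the rules behave as intended with Python's standard-library robots.txt parser. A small sketch, using the example URLs from this post:

```python
from urllib.robotparser import RobotFileParser

# The rules from the robots.txt shown above.
rules = """\
User-agent: *
Disallow: /wp-includes/
Disallow: /wp-admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Posts stay crawlable...
print(parser.can_fetch("*", "https://ricard.dev/some-post/"))   # True
# ...while the admin and core folders are blocked.
print(parser.can_fetch("*", "https://ricard.dev/wp-admin/"))    # False
print(parser.can_fetch("*", "https://ricard.dev/wp-includes/")) # False
```

The path `/some-post/` is a made-up example; swap in any real URL from your site.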

Some WordPress developers also block the wp-content folder. Whether to let search engines index that folder is up to you.
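If you decide to block it, the file would look like this. Be aware that wp-content also holds your uploaded images, theme CSS, and JavaScript, so blocking it keeps those out of image search and can affect how crawlers render your pages.

```text
User-agent: *
Disallow: /wp-includes/
Disallow: /wp-admin/
Disallow: /wp-content/
```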

Here is my robots.txt: https://ricard.dev/robots.txt

Do you have a robots.txt file?

Comments

  1. Hi Quicoto, I was pleasantly surprised to find that you’ve become a web developer. I’m a fan of your photography blog 🙂 and it’s nice to see that you’ve chosen to walk this path.

    I use a slightly more complex robots.txt file; it’s nothing but restrictions so Google doesn’t index duplicate content (pagination links, plus author and category pagination). You’re free to check it out anytime you want 🙂

    Hope that helps.

      1. Hi,

        I’m not an expert, not even close, but I love web design and development. I’m a huge fan of blogs like this, and be sure I’ll be checking out new content very often 🙂

        Thanks for the kind words.
        Regards
