How to Optimize Your WordPress Robots.txt File

Have you ever heard of the Robots.txt file? If you’re familiar with WordPress, you probably know it, because it has an immense impact on your site’s SEO performance. A well-optimized Robots.txt file can significantly improve your search engine rankings, while an incorrectly configured one can seriously hurt your site’s SEO.

WordPress automatically generates a Robots.txt file for your website, but you still need to take a few steps to optimize it properly. Many factors affect SEO, and this file is one you cannot ignore. Because editing it involves writing a few lines of directives, many website owners hesitate to touch it. Don’t worry: this article covers why the file matters and shows you how to optimize your WordPress Robots.txt file for better SEO.

Before moving on, let’s cover some fundamentals.

What is Robots.txt File?

Robots.txt is a text file that guides search engine bots on where they may and may not crawl on your site. When bots arrive at your site, they read the instructions in your Robots.txt file first, and then crawl and index it accordingly. With this file, you can direct bots toward the areas of your site that should be crawled and away from the areas that should not.

You can usually find this file in your WordPress root folder, and you can also view it in any browser at yourdomain.com/robots.txt. Since it is an ordinary text file, you can open it with any text editor, such as Notepad.

If no such file exists, simply create a plain text file, save it as robots.txt, and upload it to the root folder of your site using an FTP client.
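If you are creating the file from scratch, a minimal starting point that lets every bot crawl everything looks like this (an empty Disallow value blocks nothing):

User-agent: *
Disallow: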

How to Edit WordPress Robots.txt File

You can edit the Robots.txt file either through your server’s FTP account or with a WordPress plugin. I suggest using a plugin for this purpose, as it is much simpler. The best plugin for editing your Robots.txt file right from the WordPress Dashboard is Yoast SEO.
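As a side note, if you are comfortable with code, WordPress also exposes a robots_txt filter that lets you modify the automatically generated virtual file from your theme’s functions.php. This is only a minimal sketch: it takes effect only while no physical robots.txt file exists in your root folder (WordPress serves the virtual file only in that case), and the sitemap URL is a placeholder:

// In your theme's functions.php: append rules to WordPress's virtual robots.txt
add_filter( 'robots_txt', function ( $output, $public ) {
    // Only add rules when the site is visible to search engines
    if ( $public ) {
        $output .= "Disallow: /wp-content/plugins/\n";
        $output .= "Sitemap: https://yourdomain.com/post-sitemap.xml\n";
    }
    return $output;
}, 10, 2 );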

I am assuming you already have this plugin installed. Go to SEO > Tools in your WordPress Dashboard.

The next step is to click on File Editor, which lets you edit the Robots.txt file as well as the .htaccess file.

As I mentioned earlier, WordPress automatically creates a default Robots.txt file for your site, and you will see its contents in this editor.

Let’s look at the directives used in this file (a short example follows this list).

  • User-agent – Specifies which search engine bots the rules below it apply to. To address every bot at once, use User-agent: *.
  • Disallow – Tells the matched bots not to crawl the specified parts of your website.
  • Allow – Explicitly tells the matched bots that they may crawl the specified parts of your website, even inside an otherwise disallowed directory.
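Here is a short illustration of how these directives combine; the /private/ directory is purely hypothetical and used only for this example. These rules tell Googlebot to stay out of /private/ while letting every other bot crawl the whole site:

User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow: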

If no Robots.txt file shows up there, just click the create button to make a new one.

How to Optimize WordPress Robots.txt File For Better SEO

According to Google’s Webmaster Guidelines, you should not use the Robots.txt file to hide low-quality content. Using it to stop search engines from indexing your category, tag, date, and other archive pages may not be a good idea either. Instead, the Yoast SEO plugin lets you apply a noindex, nofollow meta tag to the content you don’t want indexed. Likewise, it is bad practice to use the Robots file to keep search engine bots away from duplicate content; that is better handled by other means, such as canonical URLs.
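For reference, this is what a noindex, nofollow robots meta tag looks like in a page’s head section; Yoast SEO generates it for you when you change a post’s indexing settings, so you rarely need to write it by hand:

<meta name="robots" content="noindex, nofollow">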

Avoid the bad practices mentioned above to keep your WordPress Robots.txt file search engine friendly.

Now it’s time to follow some good practices.

With this file, you control exactly which parts of your site search engine bots may and may not crawl. I recommend disallowing the readme.html file, the plugin directory, the admin area, and trackback URLs. Adding your sitemaps to the Robots.txt file is also a good idea, as it helps search engine bots discover and index your pages faster.

Here’s a sample:

# Rules for every search engine bot
User-agent: *
# Keep bots out of the admin area, plugin files, the readme, trackbacks, and /go/
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
Disallow: /readme.html
Disallow: /trackback/
Disallow: /go/
# But allow admin-ajax.php (used by many themes and plugins) and uploaded media
Allow: /wp-admin/admin-ajax.php
Allow: /wp-content/uploads/
# Point bots to your XML sitemaps
Sitemap: https://yourdomain.com/post-sitemap.xml
Sitemap: https://yourdomain.com/page-sitemap.xml

Testing WordPress Robots.txt File

The next task is to check whether your Robots.txt file has been updated properly. The best way to do this is with Google Search Console’s robots.txt Tester.

Open your Google Search Console account and select your site. From the menu on the left side of your screen, choose Crawl > robots.txt Tester and click on the Submit button.

After that, a pop-up box appears on your screen; simply click its Submit button to let Google know your Robots.txt file has been updated.

Next, reload the page to check whether the new version has been picked up. It may take some time for the update to appear.

If it has not been updated yet, you can paste your file’s content directly into the tester box instead.

If you see any errors or warnings, go back and correct your Robots.txt file accordingly.
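You can also run a quick sanity check from the command line to confirm that the live file matches what you saved (replace yourdomain.com with your own domain):

curl https://yourdomain.com/robots.txt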

Last Words

That’s it. You already know how important SEO is to your website’s performance, and good SEO means getting every detail right, including this file. Pay close attention when editing your Robots.txt file, because a wrong configuration can stop search engine bots from crawling and indexing your entire website.

Do share your comments below on how you optimize your WordPress Robots.txt file for better SEO. And follow me on social media to get notified about new articles.
