How to Optimize Your WordPress Robots.txt File

Have you ever heard of the Robots.txt file? If you’re familiar with WordPress, you probably know this file, because it has a significant impact on your site’s SEO performance. A well-optimized Robots.txt file can noticeably improve your search engine rankings, while an incorrectly configured one can seriously harm your site’s SEO.

WordPress automatically generates a Robots.txt file for your website, but you still need to take a few steps to optimize it properly. Many other factors affect SEO, but this file is essential. Since editing it involves writing a few lines of directives, most website owners hesitate to make changes to it. You don’t need to worry. Today’s article covers why the file matters and shows you how to optimize the WordPress Robots.txt file for better SEO.

Before moving further, let’s learn some fundamental points.

What Is a Robots.txt File?

Robots.txt is a plain text file that tells search engine bots where they may and may not crawl on your site. When bots arrive at your site, they read the instructions in your Robots.txt file first, and only then crawl and index your pages. With the help of this file, you can direct search engine bots to the areas of your site they should crawl and keep them away from the areas they should not.

Usually, you can find this file in your WordPress root folder. Since it is a normal text file, you can open it with a text editor like Notepad.

If there is no such file, just create a plain text file, save it as robots.txt, and upload it to the root folder of your site using an FTP client.
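
If you are starting from scratch, a minimal file is enough. The virtual file WordPress generates on its own typically looks like the following (your copy may differ slightly depending on your WordPress version and settings):

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php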

How to Edit WordPress Robots.txt File

Editing the Robots.txt file can be done either through your server’s FTP account or with a WordPress plugin. I suggest using a plugin for this purpose, as it is very simple.

The best plugin for editing your Robots.txt file from the WordPress Dashboard is Yoast SEO.

I am assuming you already have this plugin installed. Go to SEO > Tools in your WordPress Dashboard.

The next step is to click on File Editor, which allows you to edit the Robots.txt file as well as the .htaccess file.

As I mentioned earlier, WordPress automatically creates a Robots.txt file for your site, as you can see in the screenshot below.

Let’s learn about the commands.

  • User-agent – Specifies which search engine bots the rules that follow apply to. To address every bot, use User-agent: *.
  • Disallow – Blocks search engine bots from crawling the given parts of your website.
  • Allow – Tells search engine bots which parts of the website they may crawl, even inside a disallowed folder (see the short example after this list).
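
Here is a short illustration of the three directives working together. It restricts only Google’s crawler and leaves every other bot unrestricted (the /private/ paths are made-up examples):

User-agent: Googlebot
Disallow: /private/
Allow: /private/public-page.html

User-agent: *
Disallow: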

If no Robots.txt file is shown, just click the Create button to make a new one.

How to Optimize WordPress Robots.txt File for Better SEO

As per Google’s Webmaster Guidelines, don’t use the Robots.txt file to hide low-quality content. Using the Robots.txt file to stop search engines from indexing your categories, tags, date archives and so on is not a good idea. Instead, the Yoast SEO plugin lets you add noindex and nofollow meta tags to the content you don’t want indexed. It is also bad practice to use the Robots file to stop search engine bots from indexing duplicate content; that problem is better handled by other means, such as canonical URLs.
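
For reference, a page excluded this way ends up with a meta robots tag in its HTML head along these lines (Yoast writes it for you; the exact attributes depend on your settings):

<meta name="robots" content="noindex, nofollow">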

So, try to avoid the bad practices mentioned above to keep your WordPress Robots.txt file search engine friendly.

Now it’s time to follow some good practices.

In this file, you can allow or disallow crawling for specific paths. I recommend disallowing the readme.html file, the plugin directory, the admin area, and trackbacks. Including your sitemaps in the Robots.txt file is also a good idea, as it helps search engine bots discover and index your pages.

Here’s a sample:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
Disallow: /readme.html
Disallow: /trackback/
Disallow: /go/
Allow: /wp-admin/admin-ajax.php
Allow: /wp-content/uploads/
Sitemap: https://yourdomain.com/post-sitemap.xml
Sitemap: https://yourdomain.com/page-sitemap.xml

“User-agent: *” means the section applies to all robots, including the search bots from Google, Bing, Yahoo, and so on. Each Disallow line tells those robots not to visit the path that follows it, so the sample above keeps bots out of the listed folders and files, while the Allow lines make exceptions for admin-ajax.php and the uploads folder. A lone “Disallow: /” would tell robots not to visit any page on the site.
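
To make the difference concrete, here is how the common Disallow forms behave (each shown separately):

# Block the entire site:
Disallow: /

# Block only one folder:
Disallow: /wp-admin/

# Block nothing (everything may be crawled):
Disallow: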

The Robots.txt file instructs search engine robots about which pages of your website should be crawled and indexed. Virtually every website has folders and files that shouldn’t be crawled or indexed, because they are irrelevant to search engines (admin files, for example). Using a Robots.txt file can therefore improve your website’s indexation by keeping robots away from unnecessary pages and folders.

One reason some blogs and websites lose traffic and rankings is that their robots file is structured incorrectly. Beyond the direct damage to crawling, it is a relatively unknown reason why some sites lose their SERP positions.

Testing WordPress Robots.txt File

The next task is to check whether the Robots.txt file has been updated properly. The best way to do this is with the robots.txt Tester in Google Search Console.

Open your Google Search Console account and select your site. On the left side of the screen, go to Crawl > robots.txt Tester and click the Submit button.

After that, a pop-up box appears on your screen; simply click the Submit button again to let Google know your Robots.txt file has been updated.

Next, reload the page to check whether the new version has been picked up. It may take some time for the update to appear.

If it has not been updated yet, you can paste your file’s content directly into the test box to check it.

If you see errors or warnings, edit your Robots.txt file to fix them.
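
If you prefer checking from the command line, Python’s standard library ships a small robots.txt parser you can use for a quick sanity check. A minimal sketch, assuming your site lives at https://yourdomain.com (swap in your real domain):

from urllib import robotparser

# Fetch and parse the live robots.txt file
# (yourdomain.com is a placeholder - use your real domain)
rp = robotparser.RobotFileParser()
rp.set_url("https://yourdomain.com/robots.txt")
rp.read()

# Ask whether a generic crawler ("*") may fetch a few sample paths
for path in ["/wp-admin/", "/readme.html", "/wp-content/uploads/logo.png"]:
    url = "https://yourdomain.com" + path
    print(path, "->", "allowed" if rp.can_fetch("*", url) else "blocked")

One caveat: Python’s parser applies rules in file order, while Google prefers the most specific matching rule, so results can differ for overlapping Allow/Disallow pairs; the Search Console tester remains the authority.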

Last Words

That’s it. You know how important SEO is to the performance of your website, and the Robots.txt file is one of the details you need to get right. Pay close attention while editing it, because a wrong configuration can stop search engine bots from crawling your entire website.

While using a robots file is a legitimate, SEO-approved practice, overusing it can hurt your rankings. Use it only where there is a real need, and reach for noindex meta tags as the alternative wherever possible; the noindex tag is the standard and preferred SEO practice for keeping individual pages out of the index.

Do share your thoughts below on how you optimize the WordPress Robots.txt file for better SEO, and follow me on social media to be notified of new articles.
