WordPress saves webmasters time and effort by automatically creating a Robots.txt file after installation. But the questions remain: How do you optimize the WordPress Robots.txt file for SEO? And how do you use robots.txt for SEO?
If you have been neglecting that file, then it’s time to take care of it. In simple terms, the Robots.txt file tells search engine bots which pages to crawl and which pages to avoid.
That being said, it’s crucial to optimize the WordPress Robots.txt file for SEO and get it right. This way, you guarantee that search engines understand and apply your crawling preferences.
What is Robots.txt?
Robots.txt is an important file located in the root directory of your WordPress installation. It simply contains clear instructions for search engine bots regarding which pages to crawl and index in search results.
It’s not dedicated only to Google’s bots; it works with any other search engine bots as well. It’s important to have such a file for a proper SEO setup of your site.
One thing to note: not having a Robots.txt file will not stop search engines from crawling your pages, but the file keeps your search results neat and organized to serve your SEO goals.
How to Create a Robots.txt WordPress file?
Like I said: after installing WordPress on your site, a Robots.txt file is automatically created with default parameters. You can check it via http://www.yourdomain.com/robots.txt.
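If you prefer to check a site’s robots.txt programmatically rather than in the browser, here is a minimal Python sketch using only the standard library (the domain is a placeholder — substitute your own):

```python
from urllib.parse import urljoin
from urllib.request import urlopen

def robots_url(domain: str) -> str:
    """Build the conventional robots.txt URL for a domain."""
    return urljoin(domain, "/robots.txt")

def fetch_robots(domain: str) -> str:
    """Download and return the robots.txt body as text (requires network access)."""
    with urlopen(robots_url(domain)) as resp:
        return resp.read().decode("utf-8", errors="replace")

print(robots_url("http://www.yourdomain.com"))
# prints http://www.yourdomain.com/robots.txt
```

Calling `fetch_robots("https://www.grow.cheap")` would download the live file shown in the next paragraph.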
For a better understanding of the Robots.txt file, I would suggest checking other popular domains and WordPress sites to see what their robots.txt looks like. For example, you can check our file via: https://www.grow.cheap/robots.txt.
If you don’t have one, you can easily create it. Go to your Web Hosting Dashboard > File Manager > Create New File. In the pop-up box, enter “robots.txt” as the name of the new file (the name must be all lowercase). Check the screenshot below:
How to Optimize WordPress Robots.txt File?
There are different ways to optimize the WordPress Robots.txt file for SEO. I would say the best way is to use the Yoast SEO plugin; it is the most widely recommended SEO plugin for WordPress.
Alternatively, if you prefer not to use Yoast SEO, you can use the WP Robots Txt plugin. It allows you to edit the Robots.txt file from the admin dashboard.
Now, Let me show you how to edit Robots.txt in WordPress:
To edit Robots.txt using Yoast SEO: after installing and activating the plugin, go to WordPress Dashboard > SEO > Tools, then click on File Editor. See the screenshot below:
If you don’t yet have a Robots.txt file, you will need to click on “Create Robots.txt file“.
It will open the file editor to create and edit the Robots.txt file for your WordPress site.
Feel free to edit and configure your Robots.txt file the way you want. After completing the configurations, click on “Save Changes to Robots.txt“.
Here, I want to show you how to understand the important commands of the Robots.txt file. Mainly, there are three commands to keep in mind:
- User-agent: It refers to the search engine bot. For example, for Google, use Googlebot. I would say it’s better to use an asterisk (*) to refer to all search engine bots instead.
- Allow: This command basically tells search engines which pages and directories to crawl. Here you can define which parts of your site should be indexed.
- Disallow: This is the opposite of Allow. It basically tells search engines which pages and directories not to crawl or index, based on your preferences.
Here is an example of the WordPress default Robots.txt file:
User-agent: *
Disallow: /wp-admin/
Allow: /
The example file above instructs all search engine bots not to crawl the /wp-admin/ directory, and to crawl any other page on the site.
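You can verify that reading of the default rules yourself with Python’s standard-library robots.txt parser; the paths below are made-up examples:

```python
from urllib.robotparser import RobotFileParser

# The WordPress default rules: block /wp-admin/, allow everything else.
rules = """\
User-agent: *
Disallow: /wp-admin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# An ordinary post is crawlable, the admin area is not.
print(parser.can_fetch("*", "/my-blog-post/"))      # prints True
print(parser.can_fetch("*", "/wp-admin/edit.php"))  # prints False
```

This is a quick local sanity check; it does not replace testing the live file in Google Search Console.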
Here is another example of the ideal Robots.txt file:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
Disallow: /readme.html
Disallow: /trackback/
Disallow: /recommends/
Allow: /wp-admin/admin-ajax.php
Allow: /wp-content/uploads/
Sitemap: https://www.grow.cheap/post-sitemap.xml
Sitemap: https://www.grow.cheap/page-sitemap.xml
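Before uploading a file like this, you can sanity-check a few rules the same way with Python’s robotparser. One caveat: Python applies the first matching rule while Google applies the most specific one, so stick to unambiguous cases; the file paths below are illustrative:

```python
from urllib.robotparser import RobotFileParser

# A subset of the rules from the example file above.
rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
Disallow: /readme.html
Allow: /wp-content/uploads/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("*", "/wp-content/plugins/akismet/"))   # prints False (plugins blocked)
print(parser.can_fetch("*", "/wp-content/uploads/photo.jpg"))  # prints True (uploads allowed)
print(parser.can_fetch("*", "/readme.html"))                   # prints False (readme blocked)
```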
Verifying Robots.txt File in Google Search Console:
We’re doing this extra step to confirm that the recent changes to the Robots.txt file are not producing any errors or warnings. We’ll use Google Search Console for this.
You will need to log in to your account on Google Search Console. Then, go to Crawl > Robots.txt Tester.
A little box will pop up asking which action you want to proceed with. Choose the “Ask Google to update” option, and hit “Submit“.
Then go back to the Crawl > Robots.txt Tester page and refresh it to verify that the file has been updated successfully in Google Search Console’s records.
In case you find any errors or warnings reported, go back to your Robots.txt file and fix these issues right away.
Now you’ve got a detailed idea of how to optimize the WordPress Robots.txt file for better SEO and how to use robots.txt for SEO. Let me know in the comments below should you require any further assistance. I would love to help you further.
If you found this post helpful, don’t forget to share it on Facebook, Twitter, or Google+.