You know that your WordPress website should have a robots.txt file, but what should you put in it? In this quick tutorial, I’ll give you my best recommendation for a standard robots.txt file in WordPress that’s optimized for SEO.
Recommended WordPress Robots.txt
For most WordPress websites, the following robots.txt file should suffice. Simply copy and paste these lines of code into a file named robots.txt in the root of your website.
Note: Please don’t forget to change the URLs in the Sitemap lines to the URL of your own website, and make sure sitemaps actually exist at those locations. If you are using the Yoast SEO plugin, these sitemaps should already exist.
```
User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /readme.html
Sitemap: https://tonyteaches.tech/post-sitemap.xml
Sitemap: https://tonyteaches.tech/page-sitemap.xml
```
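After uploading the file, it’s worth confirming that it is actually being served from the root of your site. You can simply visit https://yourdomain.com/robots.txt in a browser, or use a quick script like the sketch below (the example.com URL is a placeholder; swap in your own domain).

```python
from urllib.request import urlopen

# Placeholder URL for illustration only; replace example.com with your own domain.
url = "https://example.com/robots.txt"

with urlopen(url) as response:
    # Prints the robots.txt exactly as crawlers will see it,
    # so you can confirm your changes made it live.
    print(response.read().decode("utf-8"))
```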
For those of you who care, here is an explanation of each directive in the recommended robots.txt above.
- User-Agent – The asterisk (*) means the rules apply to all bots that crawl your website
- Allow – Bots are explicitly allowed to crawl URLs that begin with this path
- Disallow – Bots are not allowed to crawl URLs that begin with this path
- Sitemap – Tells bots where to find an XML sitemap for your site
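If you’re curious how a crawler actually applies these Allow and Disallow rules, Python’s built-in robotparser module can evaluate them for you. The snippet below is just a rough sketch: it parses the recommended rules locally and checks a handful of made-up paths. Real search engines (Google in particular) have their own matching logic, so treat this as an approximation.

```python
from urllib.robotparser import RobotFileParser

# The same rules recommended above, parsed locally so no network request is needed.
rules = """\
User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /readme.html
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Check a few representative URLs (hypothetical paths, just for illustration).
for path in (
    "https://example.com/wp-content/uploads/photo.jpg",
    "https://example.com/wp-content/plugins/some-plugin/file.php",
    "https://example.com/wp-admin/",
    "https://example.com/readme.html",
    "https://example.com/a-normal-blog-post/",
):
    print(path, "->", "allowed" if parser.can_fetch("*", path) else "disallowed")
```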
For more information on robots.txt standards and best practices, see Google’s documentation on this topic. Otherwise, if you have any questions, please let me know in the comments below.