In this post, you will learn what a robots.txt file is and how to create it for your WordPress site or blog.
“robots.txt” is a plain text file placed at the root of a website that tells search engine bots which pages of the site they may crawl and index.
In this post, I’m not going to cover how to write your own robots.txt rules. For WordPress sites, I recommend using the following rules in the robots.txt file.
Just replace the Sitemap URL with your own sitemap’s URL.
```
User-Agent: *
Allow: /wp-content/uploads/
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
Disallow: /readme.html
Disallow: /refer/

Sitemap: http://www.yourdomain.com/your-sitemap.xml
```
Explanation of the above robots.txt rules
- User-Agent: * => The rules that follow apply to all search engine crawlers.
- Allow: [URL or path] => Search engine crawlers are allowed to crawl this URL or path.
- Disallow: [URL or path] => Search engine crawlers are not allowed to crawl this URL or path.
- Sitemap: [URL] => Tells search engine crawlers where the site’s sitemap is located.
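If you want to sanity-check how crawlers interpret these directives, Python’s standard-library `urllib.robotparser` can parse the rules above. This is just a quick sketch; the example file paths (`logo.png`) are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# The WordPress rules from this post (the Sitemap line is omitted;
# it does not affect crawl permissions).
rules = """\
User-Agent: *
Allow: /wp-content/uploads/
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
Disallow: /readme.html
Disallow: /refer/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Uploads stay crawlable, the admin area does not --
# except for the explicitly allowed admin-ajax.php endpoint.
print(parser.can_fetch("*", "http://www.yourdomain.com/wp-content/uploads/logo.png"))  # True
print(parser.can_fetch("*", "http://www.yourdomain.com/wp-admin/"))                    # False
print(parser.can_fetch("*", "http://www.yourdomain.com/wp-admin/admin-ajax.php"))      # True
```

Note how the more specific `Allow: /wp-admin/admin-ajax.php` rule carves an exception out of the broader `Disallow: /wp-admin/` rule.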
Create a robots.txt file in your WordPress blog or site
Create robots.txt through the Yoast SEO WordPress Plugin
If you are a Yoast SEO user, it is quite easy to create a robots.txt file for your WordPress blog.
Create a robots.txt file in WordPress
- Log in to your dashboard » go to the Yoast SEO Tools menu.
- Click on the File editor.
If you do not see the File editor option, file editing is disabled on your site. If you are using a security plugin such as iThemes Security, disable it first and check whether the File editor option appears.
- Now click on the Create robots.txt file.
- Add your robots.txt rules and save it.
- Check your robots.txt file.
Now open your domain with /robots.txt appended: https://www.yourdomain.com/robots.txt
Create robots.txt through the All in One SEO Plugin
If you are an All in One SEO Plugin user, follow the steps below:
- Go to the All in One SEO » Tools menu.
- By default, you land on the Robots.txt Editor tab.
- Scroll down, enable Custom Robots.txt » add your rules » Save Changes.