How to Set Up a WordPress Robots.txt Plugin
The Robots.txt file
In my video series about Google's Search Engine Optimization Starter Guide, I cover a section on the Robots.txt file.
Basically, this file is used to tell search engine robots not to crawl a particular page or post on a site.
This streamlines the crawling process and keeps crawlers from spending time on content that is out of date or does not need to be indexed.
It is also a way to keep duplicate content from being indexed, though you can use 301 and 302 redirects to address the duplicate content issue as well.
Webmasters can thus decide which portions of their websites get crawled and indexed.
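For illustration, a minimal Robots.txt file (served from the root of the site, e.g. example.com/robots.txt) might look like the sketch below; the /private/ path is just a placeholder, not a rule any particular site needs:

    # Apply these rules to all crawlers
    User-agent: *
    # Keep crawlers out of the admin area
    Disallow: /wp-admin/
    # Hypothetical section you do not want crawled
    Disallow: /private/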
The file also shows search engines where they can find a site's sitemaps, and webmasters can use it to suggest how quickly a search engine should crawl the server.
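Both of those are expressed as directives in the same file. A rough sketch, with a placeholder sitemap URL; note that support for Crawl-delay varies by engine (Google, for one, ignores it):

    # Point crawlers at the sitemap (placeholder URL)
    Sitemap: https://example.com/sitemap.xml
    # Ask crawlers to wait 10 seconds between requests
    # (honored by some engines, ignored by others)
    Crawl-delay: 10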
One note, though: some search engines may ignore the Robots.txt file and crawl the content anyway. This is especially true of less reputable crawlers.
So if you want to keep information from being crawled, do not rely on Robots.txt alone; store it somewhere that is not publicly accessible, such as behind a password.
How to implement
To implement a Robots.txt file on a self-hosted WordPress website, go to the Plugins page in your dashboard, select Add New, and search for Robots.txt.
When the search results appear, select the plugin of your choice and click Install Now, then Activate. The plugin should appear in your plugin list and in the drop-down menu under Settings.
Click on it, go to the configuration page, and decide whether you want to customize the settings.
Click Save and you are all done.
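If you are curious what such a plugin typically does under the hood: WordPress serves a virtual Robots.txt file through its do_robots() function, and plugins adjust it with the robots_txt filter. Here is a minimal sketch of that mechanism; the rules it appends are placeholders, not what any particular plugin outputs:

    <?php
    // Append custom rules to the virtual robots.txt that WordPress generates.
    // $output is the robots.txt text built so far; $public is truthy when
    // the site has not checked "Discourage search engines" in Settings.
    add_filter( 'robots_txt', function ( $output, $public ) {
        if ( $public ) {
            $output .= "Disallow: /private/\n"; // hypothetical path
            $output .= 'Sitemap: ' . home_url( '/sitemap.xml' ) . "\n";
        }
        return $output;
    }, 10, 2 );

For most sites the plugin's own settings page covers all of this, so you would only write code like that if you needed rules the plugin does not offer.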
The video below takes you through the process of selecting and configuring your Robots.txt implementation, so have a look and go from there.
I hope this information is helpful.
Stay with it, stay well and may your travels be prosperous.