From the course: Technical WordPress SEO

Creating and testing a robots.txt file

Did you know that you can block crawler access to any page on your site with a robots.txt file? You can control which URLs crawlers can access, block the URLs you don't want searchers to see, and keep unwanted bots off your site. In this lesson, you'll learn how robots.txt files work and how to create and test one on your own site. A robots.txt is a text file with rules that block or allow crawler access to specific parts of your site. All of your site's files are allowed for crawling unless you specify otherwise in your robots.txt file. Some types of pages you typically wouldn't want crawlers to access are login pages, user account or tracking pages, product filters and variations, staging or development environments, internal search results pages, and paginated or category pages. It all depends on your site's size and structure. Let's take a look at how you can create and test a robots.txt file. First, you need to check if there's already a robots.txt file in your…
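To make the rules above concrete, here is a minimal sketch of what a robots.txt for a WordPress site might look like. The specific paths and the sitemap URL are illustrative assumptions, not values from this course, so adapt them to your own site:

```
# Hypothetical robots.txt for a WordPress site (example.com is a placeholder)
User-agent: *
# Block the WordPress admin area, but allow the AJAX endpoint some plugins need
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
# Block internal search results pages (WordPress uses the ?s= query parameter)
Disallow: /?s=

# Point crawlers at the XML sitemap (replace with your real sitemap URL)
Sitemap: https://www.example.com/sitemap.xml
```

Rules are grouped under a `User-agent` line; `*` applies the group to all crawlers, while a specific name such as `Googlebot` would target only that bot.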
