Magento Commerce, 2.2.x

Search Engine Robots

The Magento configuration includes settings to generate and manage instructions for web crawlers and bots that index your site. The instructions are saved in a file called robots.txt (a file placed on a website that tells search engine crawlers which pages not to index) that resides in the root of your Magento installation. The instructions are directives that are recognized and followed by most search engines.

By default, the robots.txt file that is generated by Magento contains instructions for web crawlers to avoid indexing certain parts of the site that contain files used internally by the system. You can use the default settings, or define your own custom instructions for all search engines or for specific ones. There are many articles online that explore the subject in detail.
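For illustration only, such instructions use the standard robots.txt syntax of User-agent and Disallow lines. The paths below are hypothetical examples of internal folders a store might block, not the exact instructions that Magento generates:

  User-agent: *
  Disallow: /app/
  Disallow: /lib/
  Disallow: /var/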

To configure robots.txt:

1. On the Admin sidebar, tap Content. Then under Design, choose Configuration.
2. Find the Global configuration in the first row of the grid, and click Edit.

Global Design Configuration
3. Scroll down and expand the Search Engine Robots section. Then, do the following:

Search Engine Robots
a. Set Default Robots to one of the following:

  • INDEX, FOLLOW

    Instructs web crawlers to index the site and to check back later for changes.

  • NOINDEX, FOLLOW

    Instructs web crawlers to avoid indexing the site, but to check back later for changes.

  • INDEX, NOFOLLOW

    Instructs web crawlers to index the site once, but to not check back later for changes.

  • NOINDEX, NOFOLLOW

    Instructs web crawlers to avoid indexing the site, and to not check back later for changes.
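  The Default Robots setting controls the robots meta tag that Magento writes into the head of each page. For example, with the NOINDEX, NOFOLLOW setting, a rendered page typically includes a tag such as:

    <meta name="robots" content="NOINDEX,NOFOLLOW"/>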

b. If needed, enter custom instructions into the Edit Custom instruction of robots.txt file box. For example, while a site is in development, you might want to disallow access to all folders, as shown in the example below.
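  For example, the following custom instructions tell all crawlers to stay out of every folder of the site:

    User-agent: *
    Disallow: /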
c. To restore the default instructions, tap Reset to Default.
4. When complete, tap Save Configuration.