Edit robots.txt

Control how web crawlers access your website

This setting is normally only available to superadmin users.

When a site owner wishes to give instructions to web robots, they place a text file called robots.txt in the root of the website. This text file contains the instructions in a specific format.

Normally this file can be found on any website at a URL of this form: https://www.example.com/robots.txt

The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots. The standard specifies how to inform the web robot about which areas of the website should not be processed or scanned.
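As a sketch of the format described above, a minimal robots.txt might look like the following. The directory paths and sitemap URL are hypothetical examples, not defaults of any particular site:

```
# Apply the rules below to all crawlers
User-agent: *
# Do not crawl these (hypothetical) directories
Disallow: /admin/
Disallow: /private/
# Optional: point crawlers to the sitemap (hypothetical URL)
Sitemap: https://www.example.com/sitemap.xml
```

An empty `Disallow:` line (or an empty file) means the whole site may be crawled; `Disallow: /` blocks compliant crawlers from the entire site. Note that robots.txt is advisory only, so it is not a security mechanism.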

More on Wikipedia