You can block these AI user-agents the same way you would block Google crawlers: replace the default robots.txt file with a new file that specifies disallow rules for the specific AI user-agents.
Warning
The Book Like A Boss platform does not validate custom files: if you upload a corrupt or malformed file, it will still be served as-is.
To block both ChatGPT's crawler (GPTBot) and Google-Extended:
Create a new robots.txt file. We recommend following Google’s instructions on how to create a robots.txt file.
Add the following code to the new robots.txt file. Place the groups for specific user-agents before the wildcard (User-agent: *) group; standards-compliant crawlers obey the most specific matching group, and keeping the catch-all group last makes the file easier to read. A quick way to sanity-check the rules is shown after the block.
# Sitemap is also available on /sitemap.xml
Sitemap: http://www.example.com/sitemap.xml
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

# An empty Disallow value permits all other crawlers
User-agent: *
Disallow:
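Because the platform serves whatever you upload without validating it (see the warning above), it can help to sanity-check the rules before uploading. Below is a minimal sketch using Python's standard-library urllib.robotparser; the example.com URLs are placeholders for your own domain:

from urllib.robotparser import RobotFileParser

# The same rules as above, held in a string for a local check.
rules = """\
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: *
Disallow:
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The two AI crawlers should be blocked everywhere...
print(parser.can_fetch("GPTBot", "https://www.example.com/"))           # False
print(parser.can_fetch("Google-Extended", "https://www.example.com/"))  # False
# ...while any other crawler remains allowed.
print(parser.can_fetch("Bingbot", "https://www.example.com/"))          # True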
(Optional) If you need to block additional crawlers, add more groups in the same format (see the example after this step):
User-agent: ????
Disallow: /
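For example, the following group would also block Common Crawl's crawler. CCBot is the user-agent token Common Crawl documents for its crawler, but always confirm the exact token in the crawler operator's own documentation before relying on it:

User-agent: CCBot
Disallow: /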
Replace the default robots.txt file with the new file. To learn how, see Site Configuration Files. Note that the Source URL must match the file name exactly; otherwise the default file will not be replaced.
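Once the new file is live, you can confirm it is being served in place of the default by fetching it, for example with this short Python sketch (again, example.com stands in for your own domain):

import urllib.request

# Fetch the robots.txt your site is actually serving.
with urllib.request.urlopen("https://www.example.com/robots.txt") as response:
    body = response.read().decode("utf-8")

print(body)
# If the new file was picked up, the AI user-agent groups should be present.
assert "GPTBot" in body, "The default robots.txt is still being served."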