The first step is to add a robots.txt file, which you can place in the /public directory so it is served from the root of your site. Its content follows the usual robots.txt conventions: directives that tell crawlers which URLs they may and may not crawl (Allow, Disallow, and so on). For example:
```
# public/robots.txt

# Block all crawlers from /admin; everything else stays crawlable
User-agent: *
Disallow: /admin
Allow: /
```
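Before deploying, you can sanity-check the rules with Python's standard-library robots.txt parser. This is just a quick local sketch, not part of the app itself: it parses the same rules as the file above and asks which paths a generic crawler may fetch.

```python
from urllib.robotparser import RobotFileParser

# The same rules as in public/robots.txt above.
rules = """\
User-agent: *
Disallow: /admin
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# /admin (and everything under it) is blocked for every crawler...
print(parser.can_fetch("*", "/admin"))        # False
print(parser.can_fetch("*", "/admin/users"))  # False

# ...while the rest of the site stays crawlable.
print(parser.can_fetch("*", "/"))             # True
print(parser.can_fetch("*", "/blog/post"))    # True
```

One note on rule order: Python's parser applies the first rule that matches a path, which is why the more specific Disallow line sits above the catch-all Allow; Google's crawler instead picks the most specific matching rule, so the result is the same either way here.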