
How to add a robots.txt file to a Next.js project

Learn how to add a robots.txt file to a Next.js project the easy way, for beginners

Created on Feb 12, 2022


In this short snippet, you will learn how to add a robots.txt file to a Next.js project to tell search engines which pages/files their crawlers can and can't request from your site.

The first step is to add the "robots.txt" file itself; put it inside the "/public" directory, since Next.js serves everything in "/public" from the root of your site. Write its content the way you normally would, spelling out what crawlers are allowed or disallowed from crawling (see the project layout sketch after the example below).
# robots.txt

# Allow all crawlers, but block them from /admin
User-agent: *
Allow: /
Disallow: /admin
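
For reference, the file simply sits at the root of the public directory; in a typical project the layout would look roughly like this (the project name and surrounding folders are just placeholders/Next.js defaults):

my-next-project/
├─ pages/
├─ public/
│  └─ robots.txt   ← served as /robots.txt
└─ package.json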
Now, when you run "yarn dev", you can access the file by appending "robots.txt" to the URL: http://localhost:3000/robots.txt. Do note that the path will be the same in production as well.
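
If you prefer the terminal, a quick way to confirm the file is being served (assuming the dev server is running on the default port 3000) is:

curl http://localhost:3000/robots.txt

You should see the exact contents of public/robots.txt echoed back.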

If you like our tutorial, do make sure to support us by becoming our patron on Patreon or buying us some coffee ☕️
