
How to add a robots.txt file to a Next.js project


In this short snippet, you will learn how to add a robots.txt file to a Next.js project to tell search-engine crawlers which pages and files they can and can't request from your site.

The first step is to create a "robots.txt" file inside the "/public" directory. Next.js serves everything in "/public" as static files from the root of the site. The file's content follows the standard robots.txt format: Allow and Disallow rules that tell crawlers which URLs they may and may not fetch.
# public/robots.txt

# Allow all crawlers, but block them from /admin
User-agent: *
Disallow: /admin
Allow: /
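To see how a crawler interprets these rules, here is a tiny illustrative matcher (not a real crawler, and `isAllowed` is a hypothetical helper): per the robots.txt specification, the most specific (longest) matching rule wins.

```javascript
// isAllowed: decide whether a path may be crawled, given Allow/Disallow rules.
// The longest matching prefix wins; with no match, crawling is allowed.
function isAllowed(path, rules) {
  let best = { len: -1, allow: true } // no match => allowed by default
  for (const { type, prefix } of rules) {
    if (path.startsWith(prefix) && prefix.length > best.len) {
      best = { len: prefix.length, allow: type === 'allow' }
    }
  }
  return best.allow
}

// The rules from the snippet above:
const rules = [
  { type: 'disallow', prefix: '/admin' },
  { type: 'allow', prefix: '/' },
]

console.log(isAllowed('/admin/users', rules)) // → false ("/admin" is longer than "/")
console.log(isAllowed('/blog', rules))        // → true (only "/" matches)
```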
Now when you run "yarn dev", the file is served at http://localhost:3000/robots.txt. Do note that the path will be the same in production as well.
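If your project uses the App Router (Next.js 13.3 or later), there is also an alternative to a static file: Next.js can generate robots.txt from an app/robots.js (or robots.ts) file. A minimal sketch, assuming the App Router, mirroring the rules above:

```javascript
// app/robots.js — Next.js compiles this file into a /robots.txt route,
// as an alternative to keeping a static file in /public.
export default function robots() {
  return {
    rules: [
      { userAgent: '*', disallow: '/admin' },
      { userAgent: '*', allow: '/' },
    ],
  }
}
```

Either approach works; the static file in "/public" is the simplest when the rules never change, while the route file is handy if you want to vary the rules per environment.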

Alaz

Week-end developer currently experimenting with web, mobile, and all things programming.

Topics:

Frontend

Resource


