
How to implement a robots.txt file in a Nuxt.js project

Learn how to add a robots.txt file to a Nuxt.js project to allow or disallow user agents from crawling the website.


Having a robots.txt file is important because it helps control how search engines such as Google and Bing crawl and index your website's content. One of the first things a crawler checks when visiting a website is whether a robots.txt file exists, since it determines which content may be crawled and which may not.
There are several ways to add a robots.txt file, and the simplest is to write one manually in the "static" folder. For this tutorial, however, we'll use the @nuxtjs/robots module, as it's more flexible and lets you generate the contents of robots.txt programmatically.
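
For reference, the manual approach is just a plain text file: in Nuxt 2, everything in the "static" folder is served from the site root, so a file like the one below saved as static/robots.txt would be available at /robots.txt.

# static/robots.txt
User-agent: *
Disallow: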

1 - Install the @nuxtjs/robots Package


First things first, install the package and register it in the modules array of nuxt.config.js:
yarn add @nuxtjs/robots
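If you prefer npm over yarn, the equivalent command is:
npm install @nuxtjs/robots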
To configure the module, you can pass an object, an array, or a function as the robots value; each form has its own use cases, shown below.
export default {
  modules: [
    '@nuxtjs/robots'
  ],
  robots: {
    /* module options */
  }
}

2 - Simple configuration


The configuration below allows all user agents (bots) to crawl the site, because Disallow is empty. In contrast, if the value of Disallow is "/", crawlers are not allowed to crawl any of the pages.
export default {
  robots: {
    UserAgent: '*',
    Disallow: ''
  }
}
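
With this configuration, visiting /robots.txt should render output along these lines:

User-agent: *
Disallow: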

Multiple user agents configuration


If you want a different configuration per user agent, pass an array of objects as the robots value, and each entry will be rendered as its own block.
export default {
  robots: [
    {
      UserAgent: 'Googlebot',
      Disallow: '/user',
    },
    {
      UserAgent: '*',
      Disallow: '/admin',
    },
  ]
}
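
Beyond UserAgent and Disallow, the module can also emit other standard robots.txt directives. As a sketch, assuming the module maps a Sitemap key to the Sitemap directive (the URL below is a placeholder; check the module's documentation and substitute your own):

export default {
  robots: {
    UserAgent: '*',
    Disallow: '/admin',
    // Placeholder URL - point this at your actual sitemap
    Sitemap: 'https://example.com/sitemap.xml'
  }
}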

Function configuration


You can also pass a function as the robots value, which lets you build the configuration conditionally, for example based on the environment.
export default {
  robots: () => {
    // Example condition: block all crawling outside of production
    if (process.env.NODE_ENV !== 'production') {
      return {
        UserAgent: '*',
        Disallow: '/'
      }
    }
    // Otherwise allow everything
    return {
      UserAgent: '*',
      Disallow: ''
    }
  }
}

3 - yarn dev / npm run dev


Finally, run "yarn dev" (or "npm run dev") and visit /robots.txt to see the robots.txt you have defined. With the multiple user agents configuration from above, it should look like this:
User-agent: Googlebot
Disallow: /user
User-agent: *
Disallow: /admin

Alaz

Weekend developer currently experimenting with web, mobile, and all things programming.

Support Us

If you like our tutorial, support us by becoming a Patreon or buying us some coffee ☕️
