
How to implement a robots.txt file in a Nuxt.js project

Learn how to add robots.txt in a Nuxt.js project to allow and disallow user agents from crawling the website


Having a robots.txt file is very important as it helps control how Google and other search engines, such as Bing, index your website's content. The first thing a crawler checks when visiting a website is whether robots.txt exists, and that file determines what content may or may not be crawled.
There are several ways to add robots.txt, and the simplest is to write one manually in the "static" folder. For this tutorial, however, we'll be using nuxtjs/robots as it's more flexible and the content of the robots.txt can be easily manipulated.
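For reference, the manual approach is just a plain text file. A minimal sketch, assuming the default Nuxt 2 "static" directory, could look like this:
# static/robots.txt — served verbatim at /robots.txt
User-agent: *
Disallow: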

1 - Install nuxtjs/robots Package


First things first, install the robots package and register it in the modules array of nuxt.config.js.
yarn add @nuxtjs/robots
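If your project uses npm instead of yarn, the equivalent command is:
npm install @nuxtjs/robots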
To define the robots config we can pass an object, an array, or a function, where each form has its own use case.
export default {
  modules: [
    '@nuxtjs/robots'
  ],
  robots: {
    /* module options */
  }
}

2 - Simple configuration


In this case, define the configuration as follows and it will allow all user agents (bots) to crawl the site. In contrast, if the value of Disallow is "/" then crawlers are not allowed to crawl any of the pages.
export default {
  robots: {
    UserAgent: '*',
    Disallow: ''
  }
}
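With this configuration, visiting /robots.txt should render output along these lines:
User-agent: *
Disallow: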

Multiple user agents configuration


If you want to specify a configuration for different user agents, pass an array of objects as the robots value and it will behave as defined.
export default {
  robots: [
    {
      UserAgent: 'Googlebot',
      Disallow: '/user',
    },
    {
      UserAgent: '*',
      Disallow: '/admin',
    },
  ]
}

Function configuration


You can also pass a function as the robots value and conditionally define the configuration based on your own logic.
export default {
  robots: () => {
    // someLogicHere is a placeholder for your own condition
    if (someLogicHere) {
      return {
        UserAgent: '*',
        Disallow: '/'
      }
    }
  }
}
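As a more concrete sketch, the condition could check the environment. Here is a hypothetical version that blocks all crawlers outside of production, with someLogicHere replaced by a NODE_ENV check:
export default {
  robots: () => {
    // Hypothetical example: hide the site from crawlers on dev/staging builds
    if (process.env.NODE_ENV !== 'production') {
      return {
        UserAgent: '*',
        Disallow: '/' // disallow every page
      }
    }
    // In production, allow all user agents to crawl everything
    return {
      UserAgent: '*',
      Disallow: ''
    }
  }
}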

3 - yarn dev / npm run dev


Finally, run "yarn dev" (or "npm run dev") and visit /robots.txt to see the robots.txt that you have defined. For the multiple user agents configuration above, it will look like this:
User-agent: Googlebot
Disallow: /user

User-agent: *
Disallow: /admin


If you like our tutorial, do make sure to support us by becoming a Patreon or buying us some coffee ☕️
