1 - Install the @nuxtjs/robots Package
First things first: install the robots package and register it in the modules array of nuxt.config.js.
```
yarn add @nuxtjs/robots
```
```js
export default {
  modules: [
    '@nuxtjs/robots'
  ],
  robots: {
    /* module options */
  }
}
```
2 - Simple configuration
In this case, define the configuration as follows and it will allow all user agents (bots) to crawl the site. In contrast, if the value of Disallow is "/", no pages may be crawled at all.
```js
export default {
  robots: {
    UserAgent: '*',
    Disallow: ''
  }
}
```
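With this simple configuration, the generated /robots.txt should look roughly like the following (an empty Disallow directive means nothing is blocked):

```text
User-agent: *
Disallow:
```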
Multiple user agents configuration
If you want to specify a configuration per user agent, pass an array of objects as the robots value and each entry will be emitted as its own rule group.
```js
export default {
  robots: [
    {
      UserAgent: 'Googlebot',
      Disallow: '/user',
    },
    {
      UserAgent: '*',
      Disallow: '/admin',
    },
  ]
}
```
Function configuration
You can also pass a function as the robots value and compute the configuration conditionally, returning whatever rules your logic decides on.
```js
export default {
  robots: () => {
    // someLogicHere is a placeholder for your own condition
    if (someLogicHere) {
      return {
        UserAgent: '*',
        Disallow: '/'
      }
    }
  }
}
```
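As a concrete sketch of such a condition, the logic below blocks all crawlers unless the build is running in production. The NODE_ENV check and the /admin rule are assumptions for illustration, not part of the module's own behavior:

```javascript
// Hypothetical sketch: pick robots rules based on the environment,
// assuming staging/preview deployments should not be indexed.
const isProduction = process.env.NODE_ENV === 'production'

export default {
  modules: ['@nuxtjs/robots'],
  robots: () => {
    if (!isProduction) {
      // Block every crawler on non-production builds
      return { UserAgent: '*', Disallow: '/' }
    }
    // In production, only keep crawlers out of /admin (example path)
    return { UserAgent: '*', Disallow: '/admin' }
  }
}
```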
3 - yarn dev / npm run dev
Finally, run "yarn dev" (or "npm run dev") and visit /robots.txt to see the rules you have defined. With the multiple user agents configuration above, the output looks like this:
```text
User-agent: Googlebot
Disallow: /user

User-agent: *
Disallow: /admin
```