How to Add robots.txt to Your Nuxt.js Application


Hello everyone! In this article we will see how to add a robots.txt file to our Nuxt.js application so that it is included when the site is built or generated, before deployment.

Before going through the steps to add robots.txt to a Nuxt app, let us first learn what a robots.txt file is and why it is important for an SSR or statically generated site.

What is a robots.txt file?

Robots.txt, also known as the robots exclusion standard or protocol, is a way for a website to communicate with web crawlers and other web robots. It tells crawlers which areas or pages of the website should be crawled, and which pages should not be scanned or indexed.

How does robots.txt work?

Search engines send out small programs called robots or spiders to crawl your website and fetch its pages and their information for indexing. The robots.txt file tells these crawler bots which pages to scan and index, and which pages to skip.

You can express this with the Allow and Disallow directives in your website's robots.txt file. For example:

User-agent: *
Disallow: /admin
Allow: /

As you can see in the code above, we have disallowed web crawlers from indexing our admin page and allowed the rest of the pages to be indexed by search engines.
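To make the matching concrete, here is a small sketch (plain JavaScript, not part of Nuxt or any crawler's real code) of how a crawler can resolve these rules: each Allow/Disallow value is a path prefix, and the longest matching rule decides, which is how Google's crawler breaks ties.

```javascript
// Illustrative sketch of robots.txt rule matching, not a full
// implementation of the standard. Each rule is a path prefix;
// the longest matching prefix wins.
function isAllowed(path, rules) {
  let best = { length: -1, allow: true }; // no match means allowed
  for (const { type, prefix } of rules) {
    if (path.startsWith(prefix) && prefix.length > best.length) {
      best = { length: prefix.length, allow: type === 'allow' };
    }
  }
  return best.allow;
}

// The rules from the example above: Disallow: /admin, Allow: /
const rules = [
  { type: 'disallow', prefix: '/admin' },
  { type: 'allow', prefix: '/' }
];

console.log(isAllowed('/admin/users', rules)); // false: /admin is the longest match
console.log(isAllowed('/blog/post-1', rules)); // true: only / matches
```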

Add robots.txt to Your Nuxt.js Application Automatically

To add the robots.txt file to our Nuxt.js application, we will use a Node module that generates the file automatically when the site is built.

The npm package is @nuxtjs/robots. What this package does is inject a middleware that generates the robots.txt file.
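The underlying idea is straightforward: the module takes the options object from your config and serializes it into the robots.txt text format. The helper below is a hypothetical sketch of that serialization, not the module's actual code:

```javascript
// Hypothetical sketch of turning a robots options object into
// robots.txt text -- NOT the actual @nuxtjs/robots implementation.
function renderRobots(options) {
  return Object.entries(options)
    .map(([key, value]) => {
      // The 'UserAgent' key becomes the 'User-agent' directive;
      // 'Disallow' and 'Allow' map to directives of the same name.
      const directive = key === 'UserAgent' ? 'User-agent' : key;
      return `${directive}: ${value}`;
    })
    .join('\n');
}

console.log(renderRobots({ UserAgent: '*', Disallow: '/admin', Allow: '/' }));
// User-agent: *
// Disallow: /admin
// Allow: /
```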

Follow these steps to install the package and add a robots.txt file to your Nuxt project.

Step 1: Add the package to your project:

yarn add @nuxtjs/robots # or npm install @nuxtjs/robots

Step 2: Go to your nuxt.config.js file and add the following to the modules section:

export default {
  modules: [
    '@nuxtjs/robots'
  ],
  robots: {
    /* module options */
  }
}

Step 3: In the module options section you can set the directives for your robots.txt file. For example:

export default {
  modules: [
    '@nuxtjs/robots'
  ],
  robots: {
    UserAgent: '*',
    Disallow: '/admin',
    Allow: '/'
  }
}

This configuration allows all pages of the website to be crawled and disallows the admin pages.
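If you need different rules for different crawlers, the module's documentation also describes an array form, where each object becomes its own User-agent block. The bot name and paths below are only examples; adjust them for your own site:

```javascript
export default {
  modules: [
    '@nuxtjs/robots'
  ],
  // One robots.txt block per entry. The crawler names and paths
  // here are illustrative, not recommendations.
  robots: [
    {
      UserAgent: 'Googlebot',
      Disallow: '/admin'
    },
    {
      UserAgent: '*',
      Disallow: '/private'
    }
  ]
}
```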

Once done, restart your project. When you generate or build the project, the robots.txt file will be created automatically and you can find it in your dist folder.
