Challenges

Sitemaps & Robots.txt

Create the "robots.txt"

Now we need to create the /robots.txt file in our app. To do that, create a file called public/robots.txt and add the following content:

User-agent: *
Disallow: /api
Sitemap: https://yourapp.com/sitemap.xml

This is what's happening here:

  • With User-agent: *, we state that these rules apply to all crawlers
  • Then, with Disallow: /api, we ask crawlers not to visit URLs starting with /api
  • Finally, we define the location of our sitemap.xml file (see the sketch further below for one way to serve that file)

Now you can visit http://localhost:3009/robots.txt to view our robots.txt file.

Check this Wikipedia article to learn more about the robots.txt specification.
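The robots.txt above also advertises a sitemap.xml. One way to serve it with the pages router is to generate the XML on the fly in getServerSideProps. The sketch below is only an illustration: the file name pages/sitemap.xml.tsx, the route list, and the domain are placeholder assumptions, not part of this course's code.

// pages/sitemap.xml.tsx — minimal sketch; routes and domain are placeholders
import type { GetServerSideProps } from 'next';

// The page component never renders; the XML response is written in getServerSideProps.
const Sitemap = () => null;

export const getServerSideProps: GetServerSideProps = async ({ res }) => {
  const baseUrl = 'https://yourapp.com'; // same domain used in robots.txt
  const routes = ['', '/about', '/blog']; // hypothetical: list your real pages here

  const xml = `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${routes.map((route) => `  <url><loc>${baseUrl}${route}</loc></url>`).join('\n')}
</urlset>`;

  res.setHeader('Content-Type', 'text/xml');
  res.write(xml);
  res.end();

  return { props: {} };
};

export default Sitemap;

For a site whose pages rarely change, a static file at public/sitemap.xml works just as well.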

šŸ’” Here we created the robots.txt file inside a directory called public. You can serve static assets like images, text files, etc. by keeping them inside the public directory.
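For example, an image saved as public/logo.png (a hypothetical file) could be referenced from any page, because everything in public is served from the site root. Here's a minimal sketch, assuming the pages router and next/image:

// pages/index.tsx — minimal sketch; public/logo.png is a hypothetical asset
import Image from 'next/image';

export default function Home() {
  return (
    <main>
      {/* Files in public/ are served from the site root, so public/logo.png is available at /logo.png */}
      <Image src="/logo.png" alt="App logo" width={200} height={80} />
    </main>
  );
}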
Q: So, is this the standard way to host images and videos with Next.js?

šŸ™ We need your help

We need your help to keep maintaining & adding new content to this course. Here's how you can support us: