Sitemaps & Robots.txt

For your web app, it'll be useful to have a sitemap and a robots.txt file. These files let you control how crawlers discover and index your app.

For example, you can:

  • Mark pages and paths in your app as private
  • Flag archived pages in your app
  • Hint to search engines which pages should be crawled with higher priority

👋 Using a sitemap may not improve your page's ranking. It's just a hint for crawlers to follow.
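To make the ideas above concrete, here is a minimal robots.txt sketch that hides private paths and points crawlers at the sitemap. The `/admin` and `/drafts` paths and the `example.com` domain are placeholders for illustration, not paths from this lesson:

```txt
# robots.txt — hypothetical paths for illustration
User-agent: *
Disallow: /admin
Disallow: /drafts

# Tell crawlers where to find the sitemap
Sitemap: https://example.com/sitemap.xml
```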

In this lesson, we'll learn about Next.js rewrites and the public directory.
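As a quick preview of rewrites, a `next.config.js` like the following can map the `/robots.txt` URL to an API route that generates the file dynamically. This is a minimal sketch; the `/api/robots` route name is an assumption for illustration:

```javascript
// next.config.js — minimal sketch; /api/robots is a hypothetical route
module.exports = {
  async rewrites() {
    return [
      // Requests to /robots.txt are served by the API route instead
      { source: '/robots.txt', destination: '/api/robots' },
    ];
  },
};
```

Alternatively, a static robots.txt can simply live in the public directory, where Next.js serves it at the site root without any rewrite.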

šŸ™ We need your help

We need your help to keep maintaining and adding new content to this course. Here's how you can support us: