robots.txt

Updated Feb 6, 2025

Definition

robots.txt is a site-wide, plain-text rules file, served from the root of a domain, that tells crawlers which paths they are and are not allowed to fetch.
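
For example, a minimal robots.txt, served from the site root (e.g. https://example.com/robots.txt; the /admin/ path here is illustrative), might look like this:

    # Applies to every crawler: keep bots out of the admin area
    User-agent: *
    Disallow: /admin/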

Plain English

It's the first gate crawlers hit. You can allow or block specific bots from specific parts of your site.
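
Rules are grouped by User-agent line, so different bots can get different rules. A sketch, where BadBot stands in for whichever crawler you want to exclude:

    # This bot is barred from the entire site
    User-agent: BadBot
    Disallow: /

    # Every other bot may fetch anything (an empty Disallow blocks nothing)
    User-agent: *
    Disallow: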

Why it matters

  • Blocking a bot means compliant crawlers won't fetch those pages; robots.txt is a convention, not an enforcement mechanism.
  • A blocked page is never fetched, so bots can't read on-page meta tags like noindex; the URL can still be indexed from external links.
  • Allow AI crawlers if you want to be discovered in AI search (see the example below).
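
For instance, to open your site to a specific AI crawler, add a group for its published user-agent token. GPTBot is OpenAI's crawler token; other vendors publish their own, so check their current documentation:

    # Let OpenAI's crawler fetch the whole site
    User-agent: GPTBot
    Allow: /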