Wildcards

How to use wildcards in URL paths in a robots.txt file

What are wildcards?

Wildcards in robots.txt are special characters, such as * and $, that let you create flexible rules.

They're like a shortcut to match multiple pages or folders at once.

Wildcards are helpful for big websites with lots of pages. They make your robots.txt file simpler and easier to manage.

Two wildcards are commonly used in robots.txt:

  • *

  • $

How to use wildcards?

  1. *

The * wildcard matches any sequence of characters. It's like a blank space that can be filled with anything.

For example:

/blog* matches /blog, /blog1, /blogpost, etc.
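For instance, a single rule with * can replace many separate Disallow lines. Here is a small sketch of a robots.txt file (the /private- path prefix is a made-up example):

```
User-agent: *
# Block every URL whose path starts with /private-
Disallow: /private-*
```

Without the wildcard, you would need a separate Disallow line for each matching folder.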

  2. $

The $ wildcard marks the end of a URL. It ensures the rule only applies to exact matches.

For example:

/page$ only matches /page, not /page1 or /page/subpage
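The behavior of both wildcards can be sketched with Python regular expressions. This is a simplified illustration of the matching idea, not a full robots.txt parser, and the helper names are made up for this example:

```python
import re

def rule_to_regex(rule):
    """Translate a robots.txt path rule into an anchored regular expression.

    '*' matches any sequence of characters; a trailing '$' anchors the
    rule to the end of the URL path.
    """
    anchored = rule.endswith("$")
    if anchored:
        rule = rule[:-1]
    # Escape regex metacharacters, then restore '*' as "match anything"
    pattern = re.escape(rule).replace(r"\*", ".*")
    return re.compile("^" + pattern + ("$" if anchored else ""))

def rule_matches(rule, path):
    """Return True if the given URL path matches the robots.txt rule."""
    return bool(rule_to_regex(rule).match(path))
```

With this sketch, rule_matches("/blog*", "/blogpost") is True, while rule_matches("/page$", "/page1") is False, mirroring the examples above.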
