How to use wildcards in URL paths in a robots.txt file
Wildcards in robots.txt are special characters, such as * and $, that let you create flexible rules.
They're like a shortcut to match multiple pages or folders at once.
Wildcards are helpful for big websites with lots of pages. They make your robots.txt file simpler and easier to manage.
Two wildcards are commonly used in robots.txt: * and $.
The * wildcard matches any sequence of characters. It's like a blank space that can be filled with anything.
For example:
/blog* matches /blog, /blog1, /blogpost, etc.
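As a quick sketch, here is how * rules might look inside a robots.txt file. The /blog path and the session-ID rule are illustrative examples, not directives you need to copy:

```
User-agent: *
# Matches /blog, /blog1, /blogpost, /blog/anything, ...
Disallow: /blog*

# * also works mid-pattern: block any URL containing "?sessionid=" (illustrative)
Disallow: /*?sessionid=
```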
The $ wildcard marks the end of a URL. It ensures the rule only applies to exact matches.
/page$ only matches /page, not /page1 or /page/subpage
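And a minimal sketch of $ in practice; the .pdf rule is a hypothetical example showing how * and $ are often combined:

```
User-agent: *
# Matches only /page, not /page1 or /page/subpage
Disallow: /page$

# Combining both wildcards: block every URL that ends in .pdf (hypothetical)
Disallow: /*.pdf$
```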