# Wildcards

### What are wildcards?

Wildcards in robots.txt are special characters, `*` and `$`, that let you create flexible rules.

They're like a shortcut to match multiple pages or folders at once.

Wildcards are helpful for big websites with lots of pages. They make your robots.txt file simpler and easier to manage.

Two wildcards are commonly used in robots.txt:

* `*`
* `$`

### How to use wildcards?

1. `*`

The `*` wildcard matches any sequence of characters, including an empty one. It's like a blank space that can be filled with anything.

For example:

`/blog*` matches `/blog`, `/blog1`, `/blogpost`, etc.
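In a real robots.txt file, a rule using `*` might look like this (the user-agent and path here are purely illustrative):

```txt
User-agent: *
Disallow: /blog*
```

This blocks compliant crawlers from `/blog`, `/blog1`, `/blogpost`, and anything else whose path starts with `/blog`.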

2. `$`

The `$` wildcard marks the end of a URL, so the rule applies only to URLs that end exactly where the pattern does.

For example:

`/page$` matches only `/page`, not `/page1` or `/page/subpage`.
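Putting the two wildcards together, their matching behavior can be sketched with a small Python helper that translates a robots.txt-style pattern into a regular expression. This is a simplified illustration, not a full robots.txt parser: it handles `*` anywhere in the pattern and `$` only at the end, and the function names are our own.

```python
import re

def robots_pattern_to_regex(pattern: str) -> str:
    """Translate a robots.txt-style pattern into a regex string.

    Simplified sketch: '*' matches any sequence of characters (including
    an empty one), and a trailing '$' anchors the end of the URL path.
    """
    anchored = pattern.endswith("$")
    body = pattern[:-1] if anchored else pattern
    # Escape regex metacharacters, then restore '*' as "any sequence".
    regex = re.escape(body).replace(r"\*", ".*")
    return "^" + regex + ("$" if anchored else "")

def rule_matches(pattern: str, path: str) -> bool:
    """Return True if the pattern matches the given URL path."""
    return re.match(robots_pattern_to_regex(pattern), path) is not None

print(rule_matches("/blog*", "/blogpost"))     # True: '*' fills in 'post'
print(rule_matches("/page$", "/page"))         # True: ends exactly at '/page'
print(rule_matches("/page$", "/page/subpage")) # False: '$' rejects the longer URL
```

Note that, as in robots.txt itself, patterns match from the start of the path: `/blog*` matches `/blog` because `*` can match an empty sequence.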
