# Robots.txt editor

{% hint style="info" %}
**Who can use this feature?**

* This feature is available for Pro users.
{% endhint %}

### What is robots.txt?

Robots.txt is a simple yet powerful tool that tells search engines how to crawl your website.

Think of it as a set of instructions for search engine bots.

A robots.txt file is a plain text document. With this file, you can allow or block crawler access to specific parts of your site.
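As an illustration, here is a minimal robots.txt file (the `/private/` path is hypothetical, for demonstration only):

```txt
# Apply these rules to every crawler
User-agent: *
# Block crawling of a hypothetical private area
Disallow: /private/
# Everything else stays crawlable
Allow: /
```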

### Why do you need robots.txt?

Robots.txt helps you control how search engines see your site. You can use it to hide private or duplicate content from search results.

By controlling what gets indexed, you can improve your site's SEO and ensure your most important pages get seen.

{% hint style="info" %}
Learn more about [robots.txt with Google](https://developers.google.com/search/docs/crawling-indexing/robots/intro).
{% endhint %}

### Robots.txt on Shopify

All Shopify stores come with a default robots.txt file that includes basic SEO optimization. The file is located at `yourdomain.com/robots.txt`.

{% hint style="info" %}
You can review all the default rules in the [robots.txt file](#manage-robots.txt-file) by filtering by **Default**.
{% endhint %}
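For reference, Shopify's default robots.txt typically contains rules along these lines (an abridged sketch only — the exact rules on your store may differ, so check `yourdomain.com/robots.txt` directly):

```txt
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /checkout
Sitemap: https://yourdomain.com/sitemap.xml
```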

### How does the Robots.txt editor work?

Shopify Support can't help with edits to the `robots.txt.liquid` file.

With the Avada Robots.txt editor, you can add new rules to your robots.txt file and edit the rules that are already in the default file.

Before you learn how to use the Robots.txt editor, let's go through a few key terms.

**Understand the terms**

* **User-agent**\
  User agents are the different types of search engine bots. Each bot has a unique name in robots.txt; for example, Google's bot is called "Googlebot".\
  You can set rules for each bot, which lets you control how different search engines interact with your site. You might allow Google to crawl everything but limit other bots.
* **URL path**\
  The path portion of the URLs that you want search engines to crawl or to skip.
* **Rule type**\
  The rule type defines what the user-agent is allowed to do with your URL path (for example, **Allow** or **Disallow**).
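These three terms map directly onto lines in the robots.txt file. For example, a rule using a hypothetical `/search` path would be written as:

```txt
User-agent: Googlebot
Disallow: /search
```

Here `Googlebot` is the user-agent, `Disallow` is the rule type, and `/search` is the URL path.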

#### Add rules

1. Go to **Search appearance** -> Click **Robots.txt**<br>

   <figure><img src="https://154471318-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FPf5QlibAKQnTygXuMIzw%2Fuploads%2FZWXb65jtmhGMt9uEsxAG%2Frobots.txt%20image.png?alt=media&#x26;token=04f54536-204a-4dca-a691-9f597b146ff8" alt=""><figcaption></figcaption></figure>
2. Select user-agent and rule type<br>

   <figure><img src="https://154471318-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FPf5QlibAKQnTygXuMIzw%2Fuploads%2F6bZ79o9rKageSDChjgbs%2Fselect%20user-agent.png?alt=media&#x26;token=68788e85-de42-4c10-bb92-2bb6eb619624" alt=""><figcaption></figcaption></figure>
3. Enter your URL path. Enter only the path (like "/example") without the full website address or domain name. The path can contain [wildcards](https://docs.avada.io/seo-suite-help-center/search-appearance/robots.txt-editor/wildcards).<br>

   <figure><img src="https://154471318-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FPf5QlibAKQnTygXuMIzw%2Fuploads%2FfTPUhW7Co3DFtf9V2826%2Furl%20path.png?alt=media&#x26;token=a223dcef-0ae4-4708-8aea-a0f826f25146" alt=""><figcaption></figcaption></figure>
4. Select **Sitemap** if you want to add a sitemap to your rule. [Why should I add sitemap to my robots.txt file?](#why-should-i-add-sitemap-to-my-robots.txt-file)

   [How to set up sitemap?](https://docs.avada.io/seo-suite-help-center/other-features/sitemap-generator/html-sitemap)<br>

   <figure><img src="https://154471318-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FPf5QlibAKQnTygXuMIzw%2Fuploads%2FFDTdHjs6OjDRaQLuzdob%2Fsitemap%20in%20robots.txt%20x.png?alt=media&#x26;token=e71e8a31-c761-43fd-81ef-08b428d34c89" alt=""><figcaption></figcaption></figure>
5. Click **Add rule**.
6. After you add a rule, it will be shown in the robots.txt file below.<br>

   <figure><img src="https://154471318-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FPf5QlibAKQnTygXuMIzw%2Fuploads%2FVnyzSXvrj9FKGlKi408u%2Fnewly%20added%20rule.png?alt=media&#x26;token=22ab1bbc-a7e7-4707-a029-d50df9441e87" alt=""><figcaption></figcaption></figure>
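As a sketch of what the editor produces, a rule with a wildcard path (hypothetical example) ends up in the file like this:

```txt
# Block all bots from any URL ending in .pdf
User-agent: *
Disallow: /*.pdf$
```

Here `*` matches any sequence of characters and `$` anchors the end of the URL.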

#### Manage Robots.txt file

With the robots.txt file view, you can:

* View the robots.txt file at its link
* Search rules by URL path, or filter by user-agent, status, and rule type
* Change rule types
* Delete rules

<figure><img src="https://154471318-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FPf5QlibAKQnTygXuMIzw%2Fuploads%2FcaJxUX87eBt4yqULx0F2%2Fclick%20to%20view%20robots%20fil.png?alt=media&#x26;token=042f1b65-e731-4f51-a001-d9a5def9e433" alt=""><figcaption></figcaption></figure>

<figure><img src="https://154471318-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FPf5QlibAKQnTygXuMIzw%2Fuploads%2FBhk3bkg9gdFWWewiazKQ%2Fhow%20robots%20file%20look.png?alt=media&#x26;token=bbe168c6-354d-4009-a6b9-2b20379d3273" alt=""><figcaption></figcaption></figure>

* Search and filter rules

<figure><img src="https://154471318-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FPf5QlibAKQnTygXuMIzw%2Fuploads%2FWrT98HOPDuXOPxH2WxJ1%2Fsearch%20and%20filter.png?alt=media&#x26;token=e6a68f86-c58d-4ec8-b704-a9f566dce8ff" alt=""><figcaption></figcaption></figure>

You can click the <img src="https://154471318-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FPf5QlibAKQnTygXuMIzw%2Fuploads%2Fr6aRthuW0ZSC4fVVxkOq%2Ficon%20up.png?alt=media&#x26;token=a2d93ef9-2d7b-4b80-8875-392d9f4d2e2d" alt="" data-size="line"> icon to collapse or expand a group of rules, which makes it easier to find the rule you need.

<figure><img src="https://154471318-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FPf5QlibAKQnTygXuMIzw%2Fuploads%2FlgCNsXEyFb2GfGTW4jBa%2Fcollapse%20and%20expand.png?alt=media&#x26;token=62bb9eb0-c017-4899-b80e-1bc3a709c626" alt=""><figcaption></figcaption></figure>

* Change rule type

<figure><img src="https://154471318-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FPf5QlibAKQnTygXuMIzw%2Fuploads%2FC6whLWgRQ1aOVcF12cQ9%2Fchange%20rule%20type.png?alt=media&#x26;token=3309d26e-586f-4981-93a5-ed982657a82d" alt=""><figcaption></figcaption></figure>

* Delete rules

{% hint style="warning" %}
You cannot delete default rules, but you can change their rule type.
{% endhint %}

<figure><img src="https://154471318-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FPf5QlibAKQnTygXuMIzw%2Fuploads%2F9LiNdifWcDHCA73ZgDUL%2Fdelete%20rule.png?alt=media&#x26;token=11e60f75-2cc0-4346-bcc0-a341d0fce121" alt=""><figcaption></figcaption></figure>

<details>

<summary>Why should I add sitemap to my robots.txt file?</summary>

The robots.txt file is the first document search engine bots look at when they visit your website. It tells them what is available to crawl.

Adding a sitemap to your robots.txt file helps search engines find your sitemap faster and understand the structure of your website for indexing.
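The sitemap is declared with a single `Sitemap` line that takes the full, absolute URL (replace `yourdomain.com` with your store's domain):

```txt
Sitemap: https://yourdomain.com/sitemap.xml
```

The `Sitemap` directive can appear anywhere in the file and is independent of any user-agent group.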

</details>
