Robots.txt editor
Robots.txt tells search engines which pages they should or shouldn't crawl.
Who can use this feature?
This feature is available for all users.
What is robots.txt?
Robots.txt is a simple yet powerful tool that guides search engines on how to crawl your website.
Think of it as a set of instructions for search engine bots.
A robots.txt file is a simple text document. With this file, you can allow or block access to specific parts of your site.
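For example, a minimal robots.txt file might look like the sketch below. The paths shown are only placeholders used for illustration:

```
# Apply the rules to all search engine bots
User-agent: *
# Block bots from crawling the checkout pages
Disallow: /checkout
# Explicitly allow bots to crawl the product pages
Allow: /products
```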
Why do you need robots.txt?
Robots.txt helps you control how search engines see your site. You can use it to hide private or duplicate content from search results.
By controlling what gets indexed, you can improve your site's SEO and ensure your most important pages get seen.
Learn more about robots.txt in Google's documentation.
Robots.txt on Shopify
All Shopify stores have a default robots.txt file with basic SEO optimization. The file is located at yourdomain.com/robots.txt
You can check all the default rules in the robots.txt file by filtering with Default.
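For reference, the default file typically contains rules along these lines. This is only a representative excerpt, and the exact contents may vary by store and over time:

```
# Representative excerpt of a Shopify default robots.txt (exact rules may differ)
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /checkout
Disallow: /orders
Disallow: /account
Sitemap: https://yourdomain.com/sitemap.xml
```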
How does Robots.txt editor work?
Shopify Support can't help with edits to the robots.txt.liquid file.
With the Avada Robots.txt editor, you can add new rules to the robots.txt file. You can also edit rules that are already in the default file.
Before learning how to use the Robots.txt editor, let's go through some key terms.
Understand terms
User-agent
User agents are the different search engine bots that crawl your site. Each bot has a unique name in robots.txt; for example, Google's bot is called "Googlebot". You can set rules for each bot, which lets you control how different search engines interact with your site. For instance, you might allow Google to crawl everything but limit other bots.
URL path
The path of the pages that you want search engines to crawl or not to crawl.
Rule type
The rule type defines what the user-agent is allowed or disallowed to do with your URL path.
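Putting these terms together, a single rule combines all three parts. The path below is only a placeholder:

```
# User-agent: the bot the rule applies to
User-agent: Googlebot
# Rule type (Disallow) + URL path (/search): block Googlebot from crawling /search
Disallow: /search
```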
Add rules
Go to Search appearance, then click Robots.txt
Select user-agent and rule type
Enter your URL path. Enter only the path (like "/example") without the full website address or domain name. The path can contain wildcards.
Select Sitemap if you want to insert a sitemap into your rule. Why should I add a sitemap to my robots.txt file?
Click Add rule.
After adding a rule, it will be shown in the Robots.txt file below.
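For illustration, adding a Disallow rule with a wildcard path and a sitemap could produce lines like these in the file. The path and domain are placeholders, not actual values from your store:

```
User-agent: *
# Wildcard path: blocks any collection URL that contains a sort_by parameter
Disallow: /collections/*?sort_by=*
# Optional sitemap reference added by selecting Sitemap
Sitemap: https://yourdomain.com/sitemap.xml
```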
Manage Robots.txt file
With the Robots.txt file, you can:
View the Robots.txt file at its link
Search rules by URL path, or filter by user-agent, status, and rule type
Change rule type
Delete rule
View Robots.txt file
Search and find rules
You can click the icon to collapse or expand a group of rules to find rules more easily.
Change rule type
Delete rules
You cannot delete default rules, but you can change their rule type.
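As a rough sketch, changing the rule type of a default rule from Disallow to Allow would turn the first line below into the second. The path shown is only an illustration, not necessarily a rule in your store's default file:

```
# Default rule (cannot be deleted)
Disallow: /policies/
# The same rule after its rule type is changed to Allow
Allow: /policies/
```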