
Robots.txt editor

Robots.txt tells search engines which pages they should or shouldn't crawl

Who can use this feature?

  • This feature is available for all users.

What is robots.txt?

Robots.txt is a simple yet powerful tool that guides search engines on how to crawl your website.

Think of it as a set of instructions for search engine bots.

A robots.txt file is a simple text document. With this file, you can allow or block access to specific parts of your site.
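
For example, a minimal robots.txt that lets every bot crawl the whole site except one folder (the path here is purely illustrative) looks like this:

```
# Applies to all bots
User-agent: *
# Block this folder; everything else stays crawlable
Disallow: /private-folder/
```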

Why do you need robots.txt?

Robots.txt helps you control how search engines see your site. You can use it to hide private or duplicate content from search results.

By controlling what gets indexed, you can improve your site's SEO and ensure your most important pages get seen.

Learn more about robots.txt with Google.

Robots.txt on Shopify

All Shopify stores have a default robots.txt file with basic SEO optimization. The file is located at yourdomain.com/robots.txt.

You can review all default rules in the robots.txt file by filtering with Default.
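
The exact contents vary by store, but the Shopify default file typically includes entries along these lines (an excerpt, not the full file):

```
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /checkout
Disallow: /account
Disallow: /search
Sitemap: https://yourdomain.com/sitemap.xml
```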

How does Robots.txt editor work?

Shopify Support can't help with edits to the robots.txt.liquid file.

With the Avada Robots.txt editor, you can add new rules to the robots.txt file and edit the rules that are already in the default file.

Before learning how to use the Robots.txt editor, let's go through some key terms.

Understand terms

  • User-agent: User agents are the different types of search engine bots. Each bot has a unique name in robots.txt; for example, Google's bot is called "Googlebot". You can set rules for each bot, which lets you control how different search engines interact with your site. You might allow Google to crawl everything but limit other bots, as in the sketch below.

  • URL path: The URL path of the pages that you want search engines to crawl or to skip.

  • Rule type: The rule type defines what the user-agent is allowed to do with your URL path (for example, Allow or Disallow).
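
Here is a hypothetical fragment that ties the three terms together (the paths are illustrative only): each group starts with a user-agent, and every line under it pairs a rule type with a URL path.

```
# User-agent: the bot the rules below apply to
User-agent: Googlebot
# Rule type (Allow) + URL path: let Googlebot crawl everything
Allow: /

# A second group for all other bots
User-agent: *
# Rule type (Disallow) + URL path: keep other bots away from this path
Disallow: /example-private-page
```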

Add rules

  1. Go to Search appearance and click Robots.txt

  2. Select user-agent and rule type

  3. Enter your URL path. Enter only the path (like "/example") without the full website address or domain name. The path can contain wildcards.

  4. Select Sitemap if you want to insert the sitemap into your rule (see Why should I add a sitemap to my robots.txt file? below).

    See also: How to set up a sitemap?

  5. Click Add rule.

  6. After adding a rule, it will be shown in the Robots.txt file below, as in the example after these steps.
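
For example, after adding a Disallow rule for a hypothetical wildcard path and selecting Sitemap, the relevant part of the generated file could look like this (the path and domain are placeholders):

```
User-agent: *
# Custom rule added through the editor (hypothetical path with a wildcard)
Disallow: /example/*
# Included because Sitemap was selected for the rule
Sitemap: https://yourdomain.com/sitemap.xml
```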

Manage Robots.txt file

With the Robots.txt file, you can:

  • View the Robots.txt file via its link

  • Search for rules by URL path, or filter by user-agent, status, and rule type

  • Change a rule type

  • Delete rules

You can click the icon to collapse and expand a group of rules to find them more easily.

You cannot delete default rules, but you can change their rule type.
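
As an illustration (using /search purely as an example path), changing a rule's type swaps the directive while the user-agent and URL path stay the same:

```
# Default rule
User-agent: *
Disallow: /search

# The same rule after its rule type is changed to Allow in the editor
User-agent: *
Allow: /search
```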

Why should I add a sitemap to my robots.txt file?

The robots.txt file is the first document search engine bots look at when they visit your website. It gives them information about what is available to crawl.

Adding a sitemap to the robots.txt file helps search engines find your sitemap faster and understand the structure of your website for indexing.
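
In the file itself, the sitemap reference is a single line; the domain below is a placeholder for your own store's address:

```
Sitemap: https://yourdomain.com/sitemap.xml
```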
