Robots.txt and Ecommerce SEO

Robots.txt files are an important tool to have in your technical SEO tool belt. Much like other technical SEO elements (such as schema markup, canonical links, and 301 redirects), the goal of robots.txt is to make your ecommerce website easier for search engines to understand and crawl.

Although most websites have a robots.txt file, many webmasters neglect this small yet vital file and miss out on the SEO help it can offer. Keep reading to learn how robots.txt files work and how they can benefit your ecommerce SEO initiatives.

What is a Robots.txt File?

Robots.txt (also known as the robots exclusion protocol) is a text file that tells website crawlers which pages or files on your site they are allowed to crawl and which they are not.
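
At its simplest, the file lives at the root of your domain (e.g. www.example.com/robots.txt) and pairs a User-agent line, which names the crawler being addressed, with one or more Disallow or Allow rules. The paths below are purely illustrative:

  User-agent: *
  Disallow: /private/
  Allow: /private/shareable-page.html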

Why would telling bots not to crawl certain pages be beneficial? 

All search engine bots have a crawl budget that limits the number of URLs they can and want to crawl on your website. SEO experts use robots.txt files and other tools to help search engines better understand a site's structure and which pages are most important to crawl.

A properly formatted robots.txt file can also include a link to your XML sitemap, which helps search engines like Google identify your most important pages. The robots.txt Sitemap directive allows search engines to quickly and easily find the XML sitemap on your site.
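
For example, the sitemap reference is a single line that can appear anywhere in the file and should point to the absolute URL of the sitemap (the URL below is illustrative):

  Sitemap: https://www.example.com/sitemap.xml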

Robots.txt Files for Ecommerce

So, what role does robots.txt play in ecommerce SEO? By using robots.txt strategically, you can tell search engine bots to ignore certain low-value pages, and free up their crawl budgets to be spent on more valuable pages instead (i.e. the pages that generate sales and revenue). 

For ecommerce sites, pages with low SEO value that should generally be ignored by search engine crawlers include account- and checkout-related pages and folders such as the following (a sample robots.txt covering these paths appears after the list):

  • www.example.com/cart

  • www.example.com/account

  • www.example.com/login

  • www.example.com/checkout
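
A minimal robots.txt covering the folders above might look like this (assuming your store actually uses these paths; adjust them to match your platform's URL structure):

  User-agent: *
  Disallow: /cart
  Disallow: /account
  Disallow: /login
  Disallow: /checkout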

Pages with thin or duplicate content (such as filtered or faceted category pages) should also be excluded, along with internal pages like admin logins. For example, when shoppers use a filter to narrow down the products on your ecommerce website, the resulting page shows nearly identical content to other filtered pages.

These pages are highly useful for shoppers, but can eat up your crawl budget — in other words, bots might waste valuable time crawling these URLs instead of your primary category pages and product pages, thereby hurting your ranking and sales potential. 
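
Because filtered views are typically generated through URL parameters, they can often be blocked with wildcard patterns, which Google and Bing both support. The parameter names below (color, size, sort) are assumptions for illustration only; check how your own faceted URLs are built before using them:

  User-agent: *
  Disallow: /*?color=
  Disallow: /*?size=
  Disallow: /*?sort=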

Editing the robots.txt file to instruct bots not to crawl these pages means your ecommerce website will be crawled more strategically, and your most important pages will have a better chance of ranking. Over time, this can translate to more organic search traffic and revenue.

Each ecommerce platform has different settings for configuring and managing robots.txt files. Some platforms (such as Magento and, very recently, Shopify) allow for custom configuration of the robots.txt file, while others have the robots.txt file locked.

If you cannot edit your robots.txt file, it is still possible to keep additional pages out of search results by using noindex meta tags on those pages. We discuss these further in the next section.

Robots.txt File Limitations

It is important to note that the robots.txt file is a suggestion to search engines about what to crawl and what not to crawl. Google, Bing, and other major search engines typically follow these directives, but they are not obligated to, and they can still index blocked pages at their own discretion.

Even if you have instructed bots not to crawl a certain URL, they may decide to index the page anyway, especially if it is linked to from another site. Additionally, different search engines interpret robots.txt syntax slightly differently, so a page may be indexed by one search engine but not another, or may not be fully blocked across all of them.

To ensure a page is not indexed by Google, a better solution is to place a noindex robots meta tag on the page itself. This tag specifically instructs search engines not to show the URL in their search results. Keep in mind that crawlers must be able to fetch a page to see its noindex tag, so pages carrying the tag should not also be blocked in robots.txt.
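
As a sketch, the noindex directive is a single meta tag placed in the <head> of the page; the follow value shown here simply leaves link-following behavior unchanged:

  <meta name="robots" content="noindex, follow">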

Boost Organic Traffic & Sales with Robots.txt SEO

When used in conjunction with other technical SEO best practices such as an XML sitemap and noindex meta tags, a robots.txt file can help ensure the best version of your ecommerce site is crawled and indexed, enabling it to draw more organic traffic and sales.

Because editing this file has the potential to impact your search engine rankings and traffic, it should only be modified by someone who understands robots.txt SEO best practices.


Need help managing your robots.txt file or other technical SEO details? Contact Whitecap SEO for more information and a free quote.

