
9 Tips for Improving Crawl Budget

June 23, 2020

Google and other search engines rely on crawler bots to “read” your site and understand its content. This is part of the process search engines use to list your website when a user enters a search term. However, these bots can’t crawl all your pages at once; they rely on a crawl budget. A crawl budget limits how many pages Googlebot and other bots can crawl on your site in a given time period. In this blog post, we’ll explain why your crawl budget is important and share tips for improving it.

What is a Crawl Budget?

Before improving your crawl budget, it is important to understand how it works and why it’s important.

Search engine bots are programmed to run through the pages on your website so the search engine can index them. This discovery step is called crawling. Once a page is crawled and indexed, the search engine can list it when a user types a relevant search. If the search engine doesn’t know the page exists, it can’t list it on a search engine results page (SERP).

Why Does My Crawl Budget Matter?

Your crawl budget limits how many pages search engine bots can crawl in a given time. When you create new pages, make changes to your site architecture, or make SEO updates, your pages must be crawled before search engines can show these new pages or changes in search results. This means you won’t benefit from increased traffic until these pages are crawled. If your crawl budget is too low for your website’s size, it creates a bottleneck that limits the impact of your new content or SEO efforts. This is why improving your crawl budget can be helpful if you’re not seeing results from the changes you’re making.

Is My Crawl Budget Too Low?

Crawl bots are designed to work quickly. According to Google experts, if you have fewer than 1,000 unique URLs, your crawl budget is probably not a problem. Keep in mind, though, that this count covers more than your unique pages. If, for example, you run a product marketplace with a number of different filters, each filter combination generates its own URL, and each of those URLs will be crawled individually. If you have a large number of pages and filters, these tips to improve your crawl budget will be helpful.

You can check your crawl budget using the Crawl Stats report in Google Search Console. This will show you how much Googlebot has crawled over the last 90 days. There is no particular number that indicates an ideal crawl budget, but your crawl activity should be relatively stable.

Google and other search engines should also be able to crawl your site relatively quickly. If Google crawls 100 unique URLs a day, but you have 10,000 URLs, it will take 100 days to crawl your site completely. In that case, it’s a good idea to improve your crawl budget. Start with these tips:

9 Tips for Improving Crawl Budget

1. Take a Look at Your Server Logs

Your site’s server logs can help you understand the current status of your crawl and start to optimize your crawl budget. This gives you an inside look at your site’s activity. You may need some technical expertise to sort through this information. However, server logs will tell you where site errors are coming from, so you can fix them.
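If you do want to dig in, the log itself is just text, and a short script can summarize crawler activity. Here’s a minimal sketch in Python, assuming an Apache- or Nginx-style combined-format access log at a hypothetical path, that counts Googlebot requests by status code and by path:

    import re
    from collections import Counter

    # Hypothetical path to a combined-format access log; adjust for your server.
    LOG_PATH = "access.log"

    # Combined log format: IP, identity, user, [time], "request", status, size, "referer", "user-agent"
    line_re = re.compile(
        r'\S+ \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
        r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
    )

    status_counts = Counter()
    path_counts = Counter()

    with open(LOG_PATH) as log:
        for line in log:
            m = line_re.match(line)
            # Keep only hits that identify themselves as Googlebot
            if m and "Googlebot" in m.group("agent"):
                status_counts[m.group("status")] += 1
                path_counts[m.group("path")] += 1

    print("Googlebot responses by status:", status_counts.most_common())
    print("Most-crawled paths:", path_counts.most_common(10))

Keep in mind that anything can claim to be Googlebot in its user-agent string; Google recommends verifying suspicious hits with a reverse DNS lookup.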

If you can’t get to your server logs, or you’re not sure what the information on your server logs means, don’t worry. This information is helpful, but it isn’t essential.

2. Improve Site Speed

Site load speed is vital to improving your crawl budget, but also to making your site successful in general. Improving your site load speed can help to increase traffic, improve SEO, and create a better user experience, while also optimizing your crawl budget. If your pages load quickly, bots can crawl them faster, just like users can see them faster. Large images or videos, demanding plugins, weighty JavaScript and CSS, among other things, can cause your site to load slowly.
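As a rough first pass, you can spot-check how long your key pages take to respond. This sketch uses Python with the third-party requests library and hypothetical URLs; for real diagnostics, a tool like Google’s PageSpeed Insights measures full render time, not just server response:

    import time
    import requests

    # Hypothetical pages to spot-check; swap in your own key URLs.
    urls = [
        "https://www.example.com/",
        "https://www.example.com/products",
    ]

    for url in urls:
        start = time.monotonic()
        response = requests.get(url, timeout=30)
        elapsed = time.monotonic() - start
        # Response time plus page weight gives a rough sense of crawl cost
        print(f"{url}: {response.status_code} in {elapsed:.2f}s, "
              f"{len(response.content) / 1024:.0f} KB")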

3. Reduce Errors

If a bot moves too quickly through your website, it can demand too much from the server, causing the site to slow down and produce errors. Crawling bots are designed to simply retrieve information, not hurt the functionality of the site, so they try to avoid doing this. If a bot encounters a number of errors, regardless of their source, it will crawl more slowly, or stop. You can find and resolve these errors using Google’s Index Coverage report and URL Inspection tool.
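One way to surface errors before the bots do is to walk your sitemap and flag URLs that don’t return a healthy status. Here’s a minimal sketch in Python using the requests library, assuming a standard sitemap at a hypothetical location:

    import requests
    import xml.etree.ElementTree as ET

    # Hypothetical sitemap location; adjust for your own site.
    SITEMAP_URL = "https://www.example.com/sitemap.xml"
    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    # Pull every URL listed in the sitemap
    root = ET.fromstring(requests.get(SITEMAP_URL, timeout=30).content)
    urls = [loc.text for loc in root.findall(".//sm:loc", NS)]

    # Flag anything that returns an error status
    for url in urls:
        # Some servers mishandle HEAD requests; switch to requests.get if results look off
        status = requests.head(url, allow_redirects=False, timeout=30).status_code
        if status >= 400:
            print(status, url)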

4. Reduce Redirects

If the crawler has to follow a series of redirects, it becomes more likely that the page won’t be crawled. For example, if you change your URLs when you move to a new domain, then change them again from http to https when you add an SSL certificate, the crawler might not follow the full redirect chain and may simply move on to the next URL. Try to reduce your redirects as much as possible.
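You can check for redirect chains yourself, since the Python requests library records every hop it follows. A small sketch with hypothetical URLs:

    import requests

    # Hypothetical URLs to check; swap in pages from your own site.
    urls = [
        "http://example.com/old-page",
        "http://example.com/blog",
    ]

    for url in urls:
        response = requests.get(url, allow_redirects=True, timeout=30)
        # response.history holds one entry per redirect hop that was followed
        if response.history:
            hops = [r.url for r in response.history] + [response.url]
            print(f"{len(response.history)} hop(s): " + "  ->  ".join(hops))

Anything showing more than one hop is a chain worth collapsing into a single redirect straight to the final URL.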

5. Consolidate Pages

The more pages you have, the more time it takes for bots to crawl and index them. If you have a number of pages that aren’t providing value, consider doing a content audit. Try consolidating these pages, removing those you don’t need anymore, or blocking them from the crawl (see the next tip).

6. Block Uncrawlable Pages

Not all pages need to show up in a search. For some pages, such as gated content or members-only areas, it might be better if they don’t. You can stop bots from crawling these pages with Disallow rules in your robots.txt file, which frees up crawl budget for the pages that matter. Note that robots.txt controls crawling, not indexing: to keep a page out of search results entirely, use a “noindex” robots meta tag on the page instead, and leave the page crawlable so bots can see the tag.
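Here’s what that looks like in practice, with hypothetical paths. To keep bots out of whole sections, add Disallow rules to the robots.txt file at your site root:

    User-agent: *
    Disallow: /members/
    Disallow: /account/

To keep an individual page out of the index, add a robots meta tag inside that page’s <head> instead:

    <meta name="robots" content="noindex">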

7. Build More Links

Google bots and other search engine crawlers navigate the web in ways similar to users. Bots use links to get from one page to another. This means pages with more links, both internal and external, will be easier for the bots to find, crawl, and index. Run through your content and see where you can create internal links between your own pages. Use external link building strategies to generate links from other sites. This will not only help to improve your crawl budget, but it will also improve your SEO.

8. Fix Orphan Pages

An orphan page has no links pointing to it from any other pages. It’s isolated from the rest of your site, so it’s hard for crawl bots to find. To fix this, you might add the page to your sitemap, add calls-to-action on other relevant pages, or create relevant text links from informational pages. This will help to improve your crawl budget and get these pages crawled faster.
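Finding orphans comes down to comparing the pages you know exist against the pages a link-following crawl can actually reach. Here’s a rough sketch in Python, assuming a hypothetical domain with a standard sitemap (the href regex is a crude stand-in for a real HTML parser):

    import re
    import requests
    import xml.etree.ElementTree as ET
    from urllib.parse import urljoin, urldefrag

    # Hypothetical domain and sitemap; adjust for your own site.
    SITE = "https://www.example.com/"
    SITEMAP_URL = "https://www.example.com/sitemap.xml"
    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    # Every page you say exists, according to the sitemap
    root = ET.fromstring(requests.get(SITEMAP_URL, timeout=30).content)
    sitemap_urls = {loc.text.rstrip("/") for loc in root.findall(".//sm:loc", NS)}

    # Breadth-first crawl from the homepage, discovering pages through links
    # only, the same way a crawl bot would
    href_re = re.compile(r'href="([^"#]+)"')
    seen, queue = set(), [SITE]
    while queue:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = requests.get(url, timeout=30).text
        except requests.RequestException:
            continue
        for href in href_re.findall(html):
            link = urldefrag(urljoin(url, href)).url
            if link.startswith(SITE) and link not in seen:
                queue.append(link)

    # Anything in the sitemap that the link crawl never reached is an orphan
    for url in sorted(sitemap_urls - {u.rstrip("/") for u in seen}):
        print("Orphan:", url)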

9. Limit Duplicate Content

When Googlebot and other crawlers encounter the same content on different pages, the bot assumes these pages aren’t original and therefore aren’t very valuable, so they fall to the bottom of the crawl list, if they are crawled at all. Duplicate content across different domains can hurt your SEO, while duplicate content on the same domain can hurt your crawl budget.

In some cases, you might have duplicate content on unique pages. For example, a product might have a different page for each available color, but the same description. In this case, consider consolidating the pages or changing the content so each page is unique. If you use filters to organize pages, you may have a dozen different URLs pointing to the same content. This is most common for online product marketplaces. To improve your crawl budget in this case, use Google’s recommended best practices for faceted navigation.
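A key part of that guidance is telling Google which URL is the “real” one. One common approach is a rel=canonical link, shown here with a hypothetical product URL: every filtered or color-specific variant points back to a single canonical page, so crawl and ranking signals consolidate there instead of being split across duplicates.

    <!-- In the <head> of each filtered or color-variant URL -->
    <link rel="canonical" href="https://www.example.com/products/blue-widget">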


Remember, if you have fewer than 1,000 unique URLs, you probably don’t have to worry about your crawl budget. However, these tips for improving your crawl budget can also help in other areas, like SEO and user experience. As you expand your site, regularly monitor your site errors, redirects, load speed, and the other factors mentioned here, so you don’t have to make a big overhaul down the road.

Ultimate Guide to Increasing Your Google Rankings

See how Google ranks your content and how to improve it.

Download the guide »