2 min read

Google removes URL parameter tool – here’s what you need to know

Here’s what you need to know about Google removing its URL parameters tool… 👇🏻

Later this month, Google will be removing its URL parameters tool. 

Originally, the tool gave site owners granular control over how Google crawled their sites. But as Google's technology has become better at working out which parameters are useful and which aren't, the tool now offers very little value. 

So, what do you need to do?

Absolutely nothing (kind of). 

Google has said: ‘Going forward you don’t need to do anything to specify the function of URL parameters on your site; Google’s crawlers will learn how to deal with URL parameters automatically.’

However, if you are looking to have some control over your site, here is what you should do: 

Use the robots.txt file to tell Google which pages you don’t want it to crawl (and therefore keep out of its search results). 

This may require a developer’s help, but it’s the best way to control how specific URL parameters on your website are handled. 
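As a starting point, a robots.txt file blocking parameterised URLs might look something like this (the parameter names `sessionid` and `sort` are hypothetical examples; Google’s crawler supports `*` wildcards in robots.txt rules):

```txt
# Example robots.txt – parameter names are illustrative only
User-agent: *
# Block URLs carrying a session-ID parameter (duplicate content)
Disallow: /*?sessionid=
Disallow: /*&sessionid=
# Block sorted/faceted variants of listing pages
Disallow: /*?sort=
Disallow: /*&sort=
```

The file lives at the root of your domain (e.g. `example.com/robots.txt`), and you can check how Google interprets it with the robots.txt report in Google Search Console.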

Why is this important? 

Two concepts come into play here: index bloat and crawl budget. 

What is Index Bloat? 

Index bloat is when search engines index pages on your website that aren’t very useful and gain nothing from appearing in search results.

For example, if a page has duplicate content or low-quality content, you don’t need it to show up. Save the rankings for good quality content.

This is important to be aware of because, in some cases, duplicate content can cause your better-quality pages to rank lower. In a nutshell, the poor-quality pages can end up stealing organic traffic from the pages you actually want to rank.

If you do have pages that fit the ‘poor-quality’ category, you can either delete them or make sure Google doesn’t index them. Previously, this is where you would have used the URL parameters tool.

What is a crawl budget?

A crawl budget is the total number of pages a search engine is willing to crawl on your site within a given period. Some websites have a handful of pages whilst others have thousands, so search engines allocate a limited amount of crawling to each one.

If you have lots of low-quality pages, Google may spend your crawl budget on these, leaving your high-quality pages without being indexed. The result: the lower-quality pages rank while the higher-quality pages never get the rankings they deserve.

In conclusion: 

  1. Optimise your high-quality pages to ensure they receive the best rankings possible
  2. Delete unwanted pages or use the robots.txt file to tell Google not to crawl them
  3. Speak to an SEO expert for more advice if you get stuck
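If you want to sanity-check your rules before publishing them, here’s a minimal sketch using Python’s standard-library `urllib.robotparser`. Note that the stdlib parser only does prefix matching (it doesn’t understand the `*` wildcards Google supports), so this hypothetical example blocks a path rather than a query parameter:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules for illustration – the stdlib parser matches by
# path prefix only, unlike Google's crawler, which also supports
# wildcards such as "Disallow: /*?sort=".
rules = """\
User-agent: *
Disallow: /search
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Parameterised search-results URLs fall under the blocked prefix...
print(rp.can_fetch("*", "https://example.com/search?sort=price"))  # False
# ...while ordinary content pages remain crawlable
print(rp.can_fetch("*", "https://example.com/products/blue-widget"))  # True
```

In practice you would point the parser at your live file with `rp.set_url("https://example.com/robots.txt")` followed by `rp.read()` instead of parsing an inline string.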

Here’s Google’s introduction to robots.txt to help you get started.

We know technical SEO can be tricky so as always, you can pick the easy route: Ask a professional for help! 
