The cryptic nature of Google’s search algorithm can leave SEO experts guessing about how specific actions will affect their rankings. Often, the advice SEO marketers give is based on anecdotes and things that sound plausible but aren’t proven. Given all the data the SEO community has acquired over the years, most of the common suggestions are reasonably accurate. However, fear of negative responses from Google has people sharing the equivalent of SEO creepypastas. A Google employee recently updated guidance on how disallowed URLs affect SEO for the rest of the site.

While it may sound counter-intuitive at first, website owners may not want Google to crawl every page on their site. If a page’s content could be problematic for Google, you can use your site’s robots.txt file to tell Google’s crawler to skip that URL. This practice is known as creating a “disallowed URL.” For example, it’s a good idea to turn your site’s internal search result pages into disallowed URLs, since a search result page could look like keyword spam to Google’s algorithm.
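As a concrete illustration, a minimal robots.txt along these lines would keep compliant crawlers away from internal search result pages (the /search path here is a hypothetical example; substitute whatever URL pattern your site’s search actually uses):

```
User-agent: *
Disallow: /search
```

One caveat worth knowing: Disallow blocks crawling, not indexing, so a disallowed URL can still appear in search results if enough other pages link to it.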

The misconception involving disallowed URLs is that using them can benefit other parts of the site. Many people believe that since Google’s crawler won’t spend its limited crawl budget on the blocked pages, the remaining pages will get crawled and indexed more often. This line of thinking even led some people to block pages they felt were less important, even if there was nothing wrong with them.

According to Google, this belief about disallowed URLs is wrong. Google’s Gary Illyes updated his original writeup on crawl budget last week to include some clarification about disallowed URLs.

Back in 2017, Illyes wrote a Google Webmaster Central post that addressed how crawl budgets work. In preparation for turning the original blog post into a help center article, Illyes added the following question and answer about disallowed URLs:

“Q: Do URLs I disallowed through robots.txt affect my crawl budget in any way?

A: No, disallowed URLs do not affect the crawl budget.”

It may be just a couple of sentences long, but this update can help a lot of SEO marketers and website owners. When people used disallowed URLs for the wrong purpose, they unintentionally harmed their SEO efforts. They removed content that could have helped them rank in the hope that it would boost the performance of the rest of their content. However, since disallowed URLs don’t affect the crawl budget, blocking those pages did nothing for the material that is still crawled. All they accomplished was leaving their site with less valuable content for Google to work with.

Now that Google has clarified this issue, it’s an excellent time to review your robots.txt rules and how crawlers handle your site. If you have unnecessarily blocked URLs, you should let Google crawl them. Furthermore, if it’s been a long time since your last SEO audit, you should have someone check to ensure your site isn’t operating on outdated strategies.
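If you want to audit your current rules before unblocking anything, Python’s standard urllib.robotparser module can evaluate a robots.txt file locally. The sketch below uses hypothetical rules and sample paths; swap in your own site’s file and the URLs you suspect are blocked:

```python
from urllib import robotparser

# Hypothetical robots.txt contents -- replace with your site's actual rules
rules = """\
User-agent: *
Disallow: /search
""".splitlines()

parser = robotparser.RobotFileParser()
parser.parse(rules)

# Sample paths to check -- swap in URLs you suspect are unnecessarily blocked
for path in ["/search?q=widgets", "/blog/crawl-budget-tips"]:
    verdict = "blocked" if not parser.can_fetch("Googlebot", path) else "crawlable"
    print(f"{path}: {verdict}")
```

Anything reported as blocked that shouldn’t be is a candidate for removal from your robots.txt.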

For more information about Google and SEO, read this article on Google’s new limit for top search results from the same domain.