SEO has evolved a lot over the past couple of decades. It has gone from something that relied solely on the keywords on a page to sophisticated algorithms that no one outside of the search engine companies genuinely understands. Search engines like Bing and Google put a lot of effort into refining their algorithms and search results so that they are helpful to searchers. These efforts can take many forms, and website owners often get confused about which tactic explains a drop in traffic and what they can do to fix it. At a recent SEO conference, two experts with inside experience of how Google and Bing handle these issues spoke about the way the two companies approach algorithm updates, spam, and penalties.

At the SMX Advanced event in early June, a discussion was hosted by Frédéric Dubut, who leads the spam team at Bing, and Fili Wiese, a former Google employee who worked on spam and manual actions. They opened by reminding everyone that algorithm updates, spam designations, and penalties exist to benefit the SEO ecosystem: by removing poor-quality content, they make the web more useful for everyone.

Bing and Google both demote and penalize sites, but their approaches differ slightly. Wiese said Google uses manual actions to help educate publishers and site owners about what the rules in Google's guidelines mean. Bing, on the other hand, uses manual actions more frequently to correct search results.

Whenever there is an algorithm update, some site owners complain that their site was penalized. However, algorithm changes are not penalties. Even if a website is negatively affected, it wasn't the target of the update. Algorithms are about showing the best pages for each query.

You can think of an algorithm as a math equation. The search results are the output, and that's controlled by the search engine. The inputs, however, are controlled by publishers, webmasters, and content creators, so it's not all in the hands of Google and Bing.
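As a purely hypothetical illustration of that metaphor, the sketch below models ranking as a weighted sum: the search engine owns the function and its weights, while publishers own the signal values their pages produce. None of the signal names or weights here are real ranking factors; they simply show how the same "equation" can reward different pages depending on the inputs.

```python
# Illustrative sketch of the "algorithm as equation" metaphor.
# The signal names and weights are made up, not actual ranking factors.

def rank_score(signals, weights):
    """Combine a page's signals into one score using engine-chosen weights."""
    return sum(weights.get(name, 0.0) * value for name, value in signals.items())

# The engine's side of the equation: how much each input counts.
weights = {"relevance": 0.5, "content_quality": 0.3, "link_authority": 0.2}

# The publisher's side: the signals each page actually produces.
pages = {
    "page_a": {"relevance": 0.9, "content_quality": 0.7, "link_authority": 0.4},
    "page_b": {"relevance": 0.6, "content_quality": 0.9, "link_authority": 0.8},
}

# Results are ordered by the equation's output, not hand-picked.
for name, signals in sorted(pages.items(),
                            key=lambda item: rank_score(item[1], weights),
                            reverse=True):
    print(name, round(rank_score(signals, weights), 2))
```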

The discussion also covered how Bing's search engine differs a little from Google's. Bing pushes out models to subsets of users and watches to see whether these real-life searchers are satisfied with the search results. This process continues whenever Bing rolls out algorithm updates.

Generally, Bing would rather have an algorithm deal with spammy sites than rely on manual actions. So it looks at how users react to search results, how Bing's human judges rate the search results, and what the ranking algorithm actually returns.

Google’s approach is slightly different. At the conference, Wiese explained that Google uses manual actions so it can fine-tune the way a site is penalized. A website that breaks the guidelines could find itself penalized at the whole-site level, on a subdomain, across sections of the site, or even on a page-by-page basis. In the most extreme cases, Google can choose to demote rankings, de-list a site, or apply another form of penalty.

What counts as spam varies a little between search engines. According to Wiese and Dubut, Bing has a 42-page rater’s guidelines document, whereas Google has a 166-page document. Neither company publishes these guidelines. Though Google has a more detailed rubric, both companies are generally fighting the same kinds of practices. Something that will hurt your SEO on Google will be equally damaging on Bing, and vice versa.

Though it may seem counterintuitive, a manual penalty is better for a site than being hit by an algorithm update. With Google and Bing, you’ll be told what you did wrong, and you can submit a reconsideration request to recover from a manual action. Be sure to make significant changes before submitting the reconsideration request to Google or Bing. It wasn’t a single thing that earned the penalty, so you shouldn’t expect a small change to fix everything.

To learn more about SEO trends, read this article that discusses recent advice from Google on the effectiveness of social bookmarking.