The Frequent Changes to the Google Algorithm
The Google Algorithm has changed quite a bit over the years, which is to be expected given the constant evolution of technology. How many changes? Anywhere between 500 and 600 a year. With over 150 million active websites, there needs to be a reliable system in place to filter through them all, and that is the exact purpose of the Google Algorithm. As a business, it may be hard to keep up with all the changes Google implements. Regardless of whether your business is small or large, understanding the algorithm changes will help you manage your website accordingly. Management is not as hard as it seems, and with a basic knowledge of the updates, it is easier to pick and choose which ones directly affect your site.
Let’s take a closer look at the importance of the Google Algorithm and popular updates.
Importance of the Google Algorithm
The Google Algorithm does the filtering work for you. For example, say you search Google for the best SEO company. That keyword search is filtered, yielding numerous results, and those results are the top-ranked pages on Google. Placement depends on how frequently the user's search terms appear throughout a website. This is why keyword placement is essential to ensure that your page ranks higher than it would without it. Keywords can appear in the page title, in headings, and throughout a blog post. A higher rank increases visibility and traffic.
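The idea above can be illustrated with a toy scorer that weights keyword hits by where they appear on the page. The field names and weights here are hypothetical, chosen only for illustration; Google's actual ranking signals are far more complex and not public.

```python
# Toy illustration of keyword-placement scoring (hypothetical weights;
# not Google's real algorithm).
def keyword_score(page, keyword):
    """Count keyword occurrences, weighting title > headings > body."""
    weights = {"title": 3.0, "headings": 2.0, "body": 1.0}  # assumed weights
    kw = keyword.lower()
    score = 0.0
    for field, weight in weights.items():
        text = page.get(field, "").lower()
        score += weight * text.count(kw)
    return score

page = {
    "title": "Best SEO Company",
    "headings": "Why choose the best SEO company?",
    "body": "Our SEO company helps you rank higher.",
}
keyword_score(page, "seo company")  # returns 6.0 (3.0 + 2.0 + 1.0)
```

A page that places the keyword in its title and headings, not just its body, scores higher under this scheme, mirroring the point about placement mattering as much as frequency.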
Looking closer at ranking, Google's patented PageRank system assigns a score to every search result. The higher the score, the higher the rank on the search engine results page (SERP). Scoring factors include the strength of the domain name, how and where keywords appear, and the age of the links; Google does recognize websites that have been around for quite some time. The number of web pages that link to the target page also affects the score: each link counts as a vote. Not all votes are equal, however. A vote from a higher-ranking site carries more weight than one from a lower-ranking site.
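The voting behavior described above can be sketched with the classic PageRank power iteration. This is a minimal sketch over a made-up three-page link graph, not Google's production system, which layers many additional signals on top of the link score.

```python
# Minimal PageRank sketch: each outgoing link is a "vote", and votes
# from high-rank pages carry more weight (hypothetical link graph).
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal scores
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue  # dangling pages cast no votes in this sketch
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share  # a vote worth the voter's rank
        rank = new_rank
    return rank

graph = {"home": ["blog", "about"], "blog": ["home"], "about": ["home"]}
scores = pagerank(graph)
# "home" collects the most (and strongest) votes, so it scores highest
```

Note how "home" ends up ranked above "blog" and "about": it receives a full vote from each of them, while they each receive only half of "home"'s vote, which is the "not all votes are equal" point in miniature.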
Penguin
Penguin was first introduced in 2012. Its purpose was to filter spammy sites out of Google's search results. Some sites try to gain better rankings through keyword stuffing, link schemes, and cloaking, among other tactics. While Google already had a spam-fighting system in place, the Penguin update was implemented as a more enhanced version. There were six updates before the current version, which is rumored to run in real time so it can catch spam links faster. This version also devalues bad backlinks, whereas the previous version would penalize your site. With no future updates scheduled, Google will be able to make changes as needed as opposed to planning a massive update.
Pigeon
Pigeon was a U.S.-only local search update launched in 2014. Its purpose was to provide more accurate local search results. This local algorithm affects results within Google Maps Search and Google Web Search, improving distance and location ranking parameters. Yelp had complained that Google was suppressing Yelp's reach to improve its own; with the Pigeon update, Yelp-specific queries ranked as they should. Other crowdsourcing platforms, such as OpenTable, TripAdvisor, and Kayak, benefited from this update as well.
Panda
The Panda update was established in 2011 and works as a ranking adjustment system. Whereas Penguin devalues spam in the rankings, Panda demotes it. This update was a trick on the tricksters who tried to maneuver their way to the top of the SERP rankings. Choosing to demote instead of devalue is a better way to catch those that slip through the cracks; sites that had previously succeeded now have a harder time.
Fred
As of 2017, there is a new update in place by the name of Fred. It targets sites that are low on content and heavy on ads. Some websites reportedly had 90% of their keywords drop in rank on SERPs. The key to adjusting to this algorithm is to watch the content you create: if your content is overloaded with keywords and links, Google may view it as a link scheme. It is better to have a few backlinks from trustworthy sources than dozens from various sites.
Mobile-First Indexing
Mobile-first indexing is an algorithm change that has no specific launch date as of right now. Most people search on their mobile devices, and the algorithm needs to reflect that. The goal is for websites to work well on both desktop and mobile. There is still some separation between desktop and mobile sites; keeping the two consistent will result in both being ranked.
Google will have plenty more updates to add in the future, especially given its track record. Frequent updates show that faults are being fixed, not ignored. It is not a perfect system, but the updates in place help the algorithm function as efficiently as possible on a regular basis.