You probably didn’t realize that animals could affect your website
and your business. The three Google search algorithms that have had the
biggest impact over the past three years are named for a bear and two
birds: Panda, Penguin, and Pigeon.
Google’s algorithms are designed to emphasize quality websites in search results while pushing down low-quality sites. Google’s efforts are admirable, especially since Google is the most-used search engine.
What is harder to deal with is that a small group of people, the Google employees who manage the search algorithms, can cause a lot of hardship. Changes to the algorithms can greatly reduce business revenue and increase expenses. Revenue typically falls when a website’s ranking, and thus its traffic, is lowered. Expenses rise when businesses spend money on corrections to address the algorithm changes.
We could spend a lot of time fretting about these changes, or we can push forward. Accepting the changes and objectively making corrective decisions is a better use of time. Moreover, web marketing has no endpoint, and change is inevitable.
Let’s take a look at each of Google’s pets (i.e., algorithms) to better understand them and their impact on your business. Note that for the purposes of this article, I define “low rank” as a position of 10 or beyond in search results.
Panda Algorithm
Google released Panda in February 2011. The algorithm is aimed at making low-quality or “thin” sites less visible in search results and ranking them lower.

The Google Panda U.S. patent describes a ratio between a website’s inbound links and the search queries for the website’s brand, products, and services. The ratio is used to create a modification factor that affects the entire website, not just individual pages. If the website fails to meet a certain mathematical threshold, the modification factor is applied and the website ranks lower in the search engine results pages.
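To make the idea of a site-wide modification factor concrete, here is a minimal sketch in Python. The real Panda formula and its threshold are not public; the function name, the direction of the ratio, and every number below are assumptions for illustration only.

```python
def panda_modification_factor(inbound_links, brand_queries, threshold=0.5):
    """Illustrative only. Compare a site's inbound links against search
    demand for its brand; a site whose links far outstrip genuine brand
    interest gets a site-wide demotion multiplier. The actual Panda
    formula and threshold are not public; these numbers are made up."""
    if inbound_links == 0:
        return 1.0        # nothing to evaluate
    if brand_queries == 0:
        return threshold  # links but no brand demand: apply the demotion
    ratio = brand_queries / inbound_links
    if ratio < threshold:  # hypothetical cutoff
        return threshold   # demote: ranking scores multiplied by < 1
    return 1.0             # no modification

# 5,000 inbound links but only 200 branded searches looks inflated,
# so the whole site, not just one page, would be demoted:
factor = panda_modification_factor(5000, 200)
```

The key point the sketch captures is that the multiplier applies to the entire website, so one site-wide signal can pull every page down at once.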
When we talk about low-quality websites, there are a few important factors to keep in mind.
- Duplicate content within your website is an important issue, and it concerns not only meta tags but visible content as well. Having web pages with the same title tag and meta description will hurt your rankings.
- The visible content has to be of value to visitors. Keyword density percentages matter, as does the web page’s readability score. Flesch-Kincaid and Flesch Reading Ease are two methods of measuring readability. The more a web page caters to the general public, the more likely it is to meet a quality threshold based on a mathematical equation.
- Fresh content that builds the overall depth of your website is important. This is where “thin” websites, which do not have enough content, and websites that are not refreshed with new pages, are negatively impacted.
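Of the factors above, Flesch Reading Ease is the most mechanical, and its formula is published: 206.835 − 1.015 × (words ÷ sentences) − 84.6 × (syllables ÷ words). The sketch below implements that formula with a deliberately crude syllable counter; real syllabification is more involved, so treat the syllable heuristic as an approximation.

```python
import re

def count_syllables(word):
    # Crude heuristic: count groups of consecutive vowels, and drop one
    # for a likely-silent trailing 'e'. Real syllabification is harder.
    word = word.lower()
    n = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and not word.endswith(("le", "ee")) and n > 1:
        n -= 1
    return max(n, 1)

def flesch_reading_ease(text):
    """Flesch Reading Ease:
    206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).
    Higher scores mean easier reading; 60-70 suits a general audience."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

# Short, monosyllabic sentences score very high (scores can exceed 100):
score = flesch_reading_ease("The cat sat on the mat. It was warm.")
```

Running this on your page copy gives a quick, repeatable way to check whether the text “caters to the general public” before publishing.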
Another consideration is to secure the services of writers and/or marketers who can interpret the findings and make changes to improve your website’s quality. A technical website developer is not necessarily the right specialist for this task.
A material update to Panda occurred in May 2014, with Panda 4.0. This algorithm is here to stay and keeps getting updated.
Penguin Algorithm
Google’s Penguin algorithm addresses inbound link quality. Google implemented Penguin in April 2012. It is aimed at websites that violate Google’s guidelines by using so-called black-hat techniques, such as artificially increasing the ranking of a web page by manipulating the number of links pointing to it.
To avoid getting caught in the Penguin algorithm trap, you have to know whether the websites linking to you are of quality. This means someone has to know how to determine the quality of websites based on metrics such as Google PageRank, Alexa Traffic Rank, the volume of pages indexed, and the volume of links on the web. Quality over quantity is the key, which means someone actually has to research the information.
Using Google’s Webmaster Tools, you can find out which websites link to you. Once you have that information, you can research these websites’ metrics using publicly available tools that report their Google PageRank (a score from 0 to 10; the higher, the better) and other information pertinent to determining their value.
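Once those metrics are gathered (for example, into a spreadsheet), the triage step can be automated. The sketch below is one possible set of rules; the function name and every threshold are assumptions of mine, not Google's criteria, and the output is meant to queue sites for human review rather than make a final judgment.

```python
def triage_linking_site(pagerank, alexa_rank, indexed_pages):
    """Illustrative triage of a site that links to you.
    pagerank: 0-10 (higher is better); alexa_rank: lower is better;
    indexed_pages: pages of the site in Google's index.
    All cutoffs below are assumed, not Google's actual criteria."""
    if pagerank >= 4:
        return "keep"  # established site; the link likely helps
    if pagerank <= 1 and (alexa_rank is None or alexa_rank > 1_000_000):
        return "review: likely low quality"
    if indexed_pages < 10:
        return "review: thin site"
    return "keep"

# A PageRank-6 site with healthy traffic is kept; a PageRank-0 site
# with almost no traffic is queued for human review:
verdicts = [
    triage_linking_site(6, 50_000, 5_000),
    triage_linking_site(0, 2_000_000, 3),
]
```

Rules like these only narrow the list; as the article says, someone still has to research each flagged site before acting on it.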
Once you know which websites are of low quality, you have two options.
- Contact the website and ask its owner to remove the link to yours.
- Use the Disavow Tool in Google’s Webmaster Tools to ask Google not to take that website into consideration as a link to yours.
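For the second option, the Disavow Tool accepts a plain-text file: `#` lines are comments, `domain:example.com` disavows every link from a domain, and a bare URL disavows a single page. The helper below assembles such a file; the function name and the example domains are hypothetical, but the file format is the one the tool documents.

```python
def build_disavow_file(bad_domains, bad_urls):
    """Build the contents of a disavow file in the plain-text format
    Google's Disavow Tool accepts: '#' comment lines, 'domain:' lines
    for whole domains, and bare URLs for individual pages."""
    lines = [
        "# Low-quality links identified during a link audit.",
        "# Removal was requested from the site owners first, without success.",
    ]
    lines += ["domain:" + d for d in sorted(bad_domains)]
    lines += sorted(bad_urls)
    return "\n".join(lines) + "\n"

# Hypothetical examples of a spammy domain and one paid-link page:
text = build_disavow_file(
    {"spammy-directory.example"},
    {"http://blog.example/paid-links-page.html"},
)
```

Note the order of operations matters: Google recommends requesting removal directly first (option one) and treating the disavow file as a last resort.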
The Penguin algorithm will likely be materially updated by the end of 2014, with version 3.0.
Pigeon Algorithm
Search Engine Land, the online magazine, reportedly first used the name “Pigeon” for this new algorithm, which was implemented in July 2014. It is designed to provide more useful and accurate local search results. Google stated that the new algorithm improves its distance and location ranking parameters.

The changes are visible in Google Maps and Google web search results. The update affects local businesses, which may notice an increase or decrease in website referrals and leads.
Unlike Panda and Penguin, the Pigeon algorithm does not impose penalties. Its purpose is to provide more accurate and relevant local search results.
Local business owners should consider these three points.
- So-called website authority now matters because of Google’s search ranking signals. Map listings (i.e., local business listings) are now tied to your website’s authority, which in turn goes back to how the Panda and Penguin algorithms affect your website.
- Local business listings should be pin-code verified, updated, and continually maintained. Adding photos and videos and responding to ratings and reviews is important. Positive, legitimate ratings and reviews can only help.
- Citations are important, which means you need to pin-code verify your listings at sites beyond Google (such as Bing, YellowPages.com, and Yelp) and make sure they match your Google listing exactly. Automation services can distribute your listing to over 200 other websites, which helps with the citation-building process.
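Checking that citations "match exactly" across directories usually means comparing each listing's name, address, and phone number (NAP) after normalizing cosmetic differences. The sketch below shows one way to do that; the normalization rules (whitespace, case, a single street abbreviation, digits-only phone) are a simplified assumption, and real audits handle many more address variants.

```python
import re

def normalize_nap(name, address, phone):
    """Normalize a Name/Address/Phone citation so listings pulled from
    different directories can be compared for consistency. Simplified:
    real audits normalize many more abbreviations (Ave, Ste, etc.)."""
    digits = re.sub(r"\D", "", phone)[-10:]  # keep the last 10 digits
    return (
        re.sub(r"\s+", " ", name).strip().lower(),
        re.sub(r"\s+", " ", address).strip().lower().replace("street", "st"),
        digits,
    )

# Hypothetical listings for the same business at two directories;
# spacing, case, and "Street" vs "St" should not count as mismatches:
google = normalize_nap("Acme Plumbing", "12 Main Street", "(555) 010-2030")
yelp = normalize_nap("ACME  Plumbing", "12 Main St", "555-010-2030")
consistent = google == yelp
```

A small script like this, run over exports of your listings, surfaces the inconsistent citations worth fixing first.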
Source URL: http://webmarketingtoday.com/articles/113316-SEO-Understanding-Pandas-Penguins-and-Pigeons/