
The New Quality Algorithms – Panda And Penguin


You might find it hard to believe, but Penguin and Panda are not penalties; they are algorithms. Google is adamant that these algorithmic changes should not be called penalties. Panda and Penguin are either on or off: Google takes the signals it has and tries to find the right way to adjust for them.

What Is A Google Penalty?

It’s true that Google penalties do exist. Google can penalize a website for things like unnatural outbound links, unnatural inbound links, pure spam, thin content, and more. The most common reason for a website to be manually penalized is that somebody has reported it to Google. Many people also believe that Google manually reviews the top websites in competitive search results, but this has not been confirmed.

If your website has a manual penalty, you will see it in Google Search Console, formerly known as ‘Webmaster Tools’. To check whether your website has a manual penalty, go to Search Traffic > Manual Actions. You might see a penalty like this –

[Screenshot: a manual action listed in Search Console]

And if you haven’t received a penalty, you will see this –

[Screenshot: Search Console reporting no manual webspam actions]

Note – You will not always see evidence of a penalty in the ‘Messages’ section of Google Search Console. Only if you were added to Google Search Console for the website before it was penalized will you see a message, which looks like this –

[Screenshot: manual action notification message in Search Console]

What is an Algorithmic Filter?

Google’s algorithm is a genuinely complex topic. Various parts of the algorithm continually evaluate websites and change their rankings based on what they see. For example, Google’s keyword-stuffing algorithm re-evaluates your website every time Google crawls it. Other parts of the algorithm are known as filters. Filters are adjustments that take effect only when Google decides to apply them, and Panda and Penguin are filters. If your website is evaluated as a low-quality website, the filter will push its rankings down. If your website has a lot of issues, it can be affected severely; if it has only a few, you may see a much smaller drop.
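As a rough, purely conceptual sketch (not Google’s actual code; the scores, thresholds, and sample page below are invented), the difference between an always-on signal and a filter that only moves rankings when it is applied or refreshed might look like this:

```python
# Conceptual sketch only: contrasts an always-on signal with a filter that is
# applied on refresh. All scores, thresholds, and the sample page are invented.

def keyword_stuffing_adjustment(page_text: str) -> float:
    """Recomputed on every crawl: small demotion if one term dominates the text."""
    words = page_text.lower().split()
    if not words:
        return 0.0
    top_share = max(words.count(w) for w in set(words)) / len(words)
    return -0.2 if top_share > 0.2 else 0.0

def quality_filter(site_quality_score: float, filter_refreshed: bool) -> float:
    """Changes rankings only when the filter is refreshed; severity scales the demotion."""
    if not filter_refreshed:
        return 0.0              # between refreshes, the site's position is frozen
    if site_quality_score < 0.3:
        return -0.6             # many quality issues: severe demotion
    if site_quality_score < 0.6:
        return -0.1             # only a few issues: small demotion
    return 0.0

page = "buy cheap widgets cheap widgets cheap widgets today"
score = 1.0 + keyword_stuffing_adjustment(page) + quality_filter(0.5, filter_refreshed=True)
print(round(score, 2))  # 0.7 in this invented example
```

The point is simply that the first adjustment is recomputed on every crawl, while the second only changes when the filter itself runs.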

Should We Call Penguin & Panda Penalties?

Search Google for ‘Penguin penalty’ or ‘Panda penalty’ and you will see some brilliant, renowned SEO professionals using this terminology. However, if you really understand Google’s algorithms, it is probably best to refer to Penguin and Panda as algorithmic filters, not penalties.

Facts You Must Know About Google’s Truth Algorithm


By now, you have probably heard about Knowledge-Based Trust (KBT), a Google research paper that describes a technique for scoring web documents according to the accuracy of their facts. Knowledge-Based Trust has been dubbed the ‘Trust Algorithm’, a new way of assigning a trust score in order to weed out websites that contain incorrect information.

According to the article published in New Scientist, Google wants to rank websites based on facts rather than links. The idea is to recognize the key facts on a web page and score the page according to their accuracy by assigning a trust score.

The researchers are careful to note in the paper that the algorithm does not punish websites for a lack of facts. The study shows that it can surface relevant web pages with low PageRank that would otherwise be overlooked by current technology.

In today’s algorithms, links are a popularity signal that implies authority on a specific topic. However, popularity does not always mean a web page contains correct information; celebrity gossip websites are a perfect example. Moving beyond simple popularity signals to an algorithm that can understand what a website is actually about is the direction search technology is heading.

According to the research, there are at least four issues to overcome before Knowledge-Based Trust is ready to be applied to billions of web pages.

Inappropriate Noise –

The algorithm uses a technique for recognizing facts that checks three elements, referred to as ‘knowledge triples’: a subject, a predicate, and an object. A subject is a real-world entity such as a place, thing, or person. A predicate describes an attribute of that entity. According to the research paper, an object is ‘a string, an entity, a date or a numerical value’.
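As a loose illustration of that structure (the class and field names below are assumptions for the example, not the paper’s implementation), a knowledge triple can be modelled like this, using a simple nationality fact:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class KnowledgeTriple:
    """A (subject, predicate, object) fact as described above."""
    subject: str    # a real-world entity such as a person, place, or thing
    predicate: str  # an attribute of that entity
    obj: str        # a string, an entity, a date, or a numerical value

fact = KnowledgeTriple(subject="Barack Obama", predicate="nationality", obj="USA")
print(fact)
```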

Extraction Technology Needs To Improve –

KBT cannot yet extract data sensibly from websites outside a controlled environment without being flooded with noise. The technology in question is the extractor: a system that recognizes triples within a page and assigns confidence scores to them. This section of the paper does not spell out exactly what the problem with extractors is; it simply cites “limited extraction capabilities”. To apply KBT to the whole web, extractors would have to recognize triples with a high degree of accuracy.
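A toy extractor might look something like the sketch below; the sentence pattern and confidence values are made up for illustration and bear no resemblance to Google’s production extractors, which handle far messier input:

```python
import re
from typing import List, Tuple

# Toy extractor: pulls (subject, predicate, object) triples from "X was born in Y."
# style sentences and attaches a made-up confidence score.
PATTERN = re.compile(r"([A-Z][\w .]+?) was born in ([A-Z][\w .]+?)\.")

def extract_triples(text: str) -> List[Tuple[Tuple[str, str, str], float]]:
    triples = []
    for subject, place in PATTERN.findall(text):
        confidence = 0.9 if len(place.split()) <= 3 else 0.5  # arbitrary heuristic
        triples.append(((subject.strip(), "place_of_birth", place.strip()), confidence))
    return triples

page = "Barack Obama was born in Honolulu. Nikola Tesla was born in Smiljan."
for triple, conf in extract_triples(page):
    print(triple, conf)
```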

Trivial Facts –

KBT does not yet filter trivial facts well enough to keep them from being used as a scoring signal. The research paper uses the example of a Bollywood website where almost every page states that a movie was shot in the Hindi language. That is a trivial fact and should not be used for scoring trustworthiness; it lowers the accuracy of the KBT score, because a web page can earn an unnaturally high trust score built on trivial facts.
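One naive way to screen out facts like that (purely illustrative; the paper does not prescribe this approach) is to ignore any predicate/object pair that appears on nearly every page of the same site:

```python
from collections import Counter
from typing import Dict, List, Tuple

Triple = Tuple[str, str, str]  # (subject, predicate, object)

def non_trivial_triples(pages: Dict[str, List[Triple]],
                        max_share: float = 0.8) -> Dict[str, List[Triple]]:
    """Drop (predicate, object) pairs that occur on more than max_share of a site's pages."""
    counts = Counter()
    for triples in pages.values():
        for _, pred, obj in set(triples):
            counts[(pred, obj)] += 1
    threshold = max_share * len(pages)
    return {url: [t for t in triples if counts[(t[1], t[2])] <= threshold]
            for url, triples in pages.items()}

site = {
    "/movie-a": [("Movie A", "language", "Hindi"), ("Movie A", "director", "Director X")],
    "/movie-b": [("Movie B", "language", "Hindi"), ("Movie B", "director", "Director Y")],
    "/movie-c": [("Movie C", "language", "Hindi")],
}
print(non_trivial_triples(site))  # the repeated "language: Hindi" facts are dropped
```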

Copied Content –

The KBT algorithm cannot yet identify websites whose facts have been copied from other websites. If KBT cannot detect duplicated content, it could quite possibly be spammed simply by copying facts from ‘trusted’ sources such as Freebase, Wikipedia, and other information sources.
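A crude way to flag that kind of copying (again, just an illustrative sketch rather than anything from the paper) is to measure how much of a page’s fact set already exists verbatim in a trusted corpus:

```python
from typing import Set, Tuple

Triple = Tuple[str, str, str]

def copied_share(page_facts: Set[Triple], trusted_facts: Set[Triple]) -> float:
    """Fraction of a page's triples that appear verbatim in a trusted corpus."""
    if not page_facts:
        return 0.0
    return len(page_facts & trusted_facts) / len(page_facts)

trusted = {("Paris", "capital_of", "France"), ("Berlin", "capital_of", "Germany")}
page = {("Paris", "capital_of", "France"), ("Berlin", "capital_of", "Germany"),
        ("Madrid", "capital_of", "Spain")}

share = copied_share(page, trusted)
print(f"{share:.0%} of this page's facts are copied from the trusted source")
```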

Key Steps To Improve Website Loading Speed


Throughout 2014, Google released numerous algorithm updates. Initially, these algorithms targeted keyword stuffing, spammy backlinks, and domain ownership; over time, those factors evolved into the ranking criteria we know and follow today.

As the company has continued refining its search algorithms, website loading speed has become one of the major determinants of a website’s ranking position and value, making it a prime target for optimization and a must for website owners who want to rank well. How is your website doing in the wake of Google’s numerous algorithm shifts?

Let’s look at why page speed has become such an essential ranking factor and how you can improve your website’s speed.

 

Google’s Expert Plan

Most SEOs would agree that keeping up with Google hasn’t been easy; at times it has been downright brutal. That is to be expected, since Google wants to improve the user experience by surfacing the websites most relevant to a given search term, ultimately making search more hassle-free.

How To Improve Website Load Speed?

If your website suffers from slow load times, the tips below will help you improve site speed and rank well in search results.

 

Decrease on-page components –
Between stylesheets, scripts, and Flash, numerous processes can run behind the scenes and noticeably slow down a page’s load time. If you combine your stylesheets and scripts, and replace images with CSS effects where possible, the HTTP requests that make these on-page elements work won’t have a chance to slow down your website.
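As a small sketch of that idea (the file names are placeholders, not real assets), a tiny build step can merge several stylesheets into one so the browser makes a single request instead of several:

```python
from pathlib import Path

# Demo setup: write three tiny stand-in stylesheets (placeholders, not real assets).
Path("reset.css").write_text("* { margin: 0; padding: 0; }")
Path("layout.css").write_text(".wrap { max-width: 960px; margin: auto; }")
Path("theme.css").write_text("body { font-family: sans-serif; }")

def bundle_css(files, output="bundle.css"):
    """Concatenate several CSS files into one to cut the number of HTTP requests."""
    combined = "\n".join(Path(f).read_text() for f in files)
    Path(output).write_text(combined)
    return output

bundled = bundle_css(["reset.css", "layout.css", "theme.css"])
print(f"{bundled} now replaces three separate requests with one")
```

The same approach applies to JavaScript files.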

 

Compress Large Pages –
Over the past few years, Google has made it clear that SEO best practice includes long-form content, video, and shareable media to attract more visitors and rank well in SERPs. Unfortunately, all this extra content can drag down load speed. Try to compress your larger pages so that they occupy less space and consume less bandwidth when they load.
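As a minimal illustration (the HTML snippet below is made up), gzip compression alone can dramatically shrink what actually travels over the wire:

```python
import gzip

# Made-up page content; real pages are larger, but the ratio illustrates the point.
html = ("<html><body>"
        + "<p>Long-form content, video embeds, share widgets...</p>" * 200
        + "</body></html>").encode("utf-8")

compressed = gzip.compress(html)
print(f"original: {len(html)} bytes, gzipped: {len(compressed)} bytes "
      f"({len(compressed) / len(html):.0%} of original size)")
```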

 

Remove unnecessary plugins –
If you are using WordPress in particular, running numerous plugins contributes to slow page loads. Although their ease of use and convenience make plugins an attractive option, using too many will cause your load speed to plummet and result in a poor user experience. Remove the plugins you don’t really need.

 

Use Browser Caching –
Enabling browser caching saves essential page elements to a visitor’s hard drive, which results in a faster load time when that visitor returns. Many webmasters fail to use this feature, which slows down their site and inconveniences visitors.
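Browser caching is controlled by response headers. As a minimal sketch (using Python’s standard http.server purely for demonstration; in practice this is usually configured in your web server or CMS), a Cache-Control header can be attached like this:

```python
from http.server import HTTPServer, SimpleHTTPRequestHandler

class CachingHandler(SimpleHTTPRequestHandler):
    """Serves static files with a Cache-Control header so browsers reuse them."""
    def end_headers(self):
        # Ask browsers to keep fetched assets for 7 days before re-requesting them.
        self.send_header("Cache-Control", "public, max-age=604800")
        super().end_headers()

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), CachingHandler).serve_forever()
```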

 

Review Your Hosting Plan –
Sometimes an unreliable hosting provider can have a serious impact on your site speed. Most hosting companies offer packages in a one-size-fits-all format, but the assumption that one form of hosting will work best for every website is simply unrealistic.

Can Getting Good Links Help You To Escape From Penguin?

During a recent Webmaster Central Help Hangout, John Mueller of Google was asked about a hypothetical situation in which a website was badly affected by Penguin. He stated that you can escape the grip of Google’s Penguin algorithm by generating enough good links; the weight of good links can save a website from being dragged down by bad ones. Could such a website come out of Penguin if it never disavowed or cleaned up the bad links?

According to Mueller, the new links would certainly help the website. He explained that Google’s algorithms look at whether things are improving and moving in the right direction. He added that, in this hypothetical situation, if someone realizes they did something wrong in the past and tries to correct it going forward, that is something the algorithms will pick up on and take into account.

In other words, if Penguin sees that good signals have started to outweigh the bad ones, things can definitely start to improve. Mueller maintained that anyone with bad links should still use the disavow tool and not just depend on earning new links, but in theory Penguin issues can ease if a website keeps attracting new links.

Let’s Check Out A Few Examples Of Good Links Outweighing Bad Links

Example 1

In this first example, the site belongs to a globally known brand. They offer a great product range, so people frequently mention their products and link to them from their own websites. However, around April 2012, when Penguin first came out, the company had engaged in some link buying, which got them into trouble. They then stopped building their own links, yet the website continued to receive a good number of new, completely natural links. Even so, organic traffic stayed pretty flat. The company then began an extensive link cleanup, which showed good improvement by the launch of Penguin 2.1.

[Organic traffic chart for Example 1]

Example 2

The second example is quite neat. In April 2012, the site was hit hard by Penguin. The site owner did a very thorough link cleanup; however, as in the previous example, Penguin kept the site’s traffic suppressed below what is often called ‘the Penguin ceiling’. In July 2014, the owner saw a sudden rise in traffic and assumed Penguin had been refreshed and he had finally escaped, but that was not the case: the lift came from new links he had started receiving from various press websites. When the Penguin 3.0 update rolled out on October 17, 2014, his link cleanup efforts and the newly earned natural links finally paid off. Even though the site did not recover to pre-Penguin levels, it did break free of the Penguin ceiling.

[Organic traffic chart for Example 2]

Both examples show that gaining new links alone is not enough for a website to come out of Penguin. Some sites with a mild Penguin impact may see improvements even without a link cleanup, but that does not mean you should overlook the power of cleaning up links. Link cleanup is essential for Penguin recovery.