What are the procedures to bring a website out of a Google penalty?

A Google penalty can cause serious losses to a website. Its rankings may drop, its organic traffic may fall substantially, or Google may stop showing it in search results altogether. This drop in traffic means lost revenue and a less profitable website.

A website may be penalized for manipulative malpractices that violate Google's guidelines and recommendations. Sometimes, however, a website is penalized even after following the guidelines, because some penalties are caused by externalities outside our control, for instance negative SEO (Search Engine Optimization).

Here, negative SEO means the use of unethical, black-hat techniques to sabotage a competitor's ranking on a search engine. Regardless of which factors caused the penalty, one has to take the website out of it.

Identify the type of penalty

First, determine which kind of penalty has affected the website. A fall in traffic doesn't always imply that the website was penalized. It may be caused by technical issues such as server problems, a robots.txt file blocking Google's crawler, or improper redirects.
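
For instance, a single misplaced rule in robots.txt can shut Googlebot out of the entire site and look exactly like a penalty in the analytics. A minimal illustration of the difference:

```
# This blanket rule blocks ALL crawlers (including Googlebot) from the
# whole site -- a common accident when a staging config reaches production.
User-agent: *
Disallow: /

# A safer rule blocks only what genuinely needs to stay private:
# User-agent: *
# Disallow: /admin/
```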

A traffic drop may also be experienced if the website’s visitor numbers depend on the time of the year or month. For instance, a website selling woolen garments may find a huge drop in traffic during the summer months.

Signs that do point to a penalty include traffic dropping on individual pages or for specific keywords, a significant drop in overall traffic, a number of pages getting de-indexed, or the entire site getting de-indexed by Google, among others.

Penalties are mainly of two types: A. manual penalties and B. algorithmic penalties. Manual penalties are imposed by Google's human Quality Raters. Algorithmic penalties are triggered when Google's spam-detection algorithms find violations.

  A. Manual penalty. Google's algorithms may not be able to detect every kind of guideline violation accurately. Sites flagged as potential violators are therefore evaluated by Quality Raters (QR), and if violations are indeed found, the QRs may penalize the site. Unless one deliberately engages in malpractice, such manual penalties are rarely imposed.

To check for a manual penalty, log into Google Search Console (GSC), where a message appears if the website has been penalized. Check "All Messages" in the main account dashboard, or click "Messages" in the side menu for an individual site. Alternatively, look in the "Search Traffic -> Manual Actions" section of the menu.

Manual penalties are of many types; some cover the entire site and some are specific to certain pages. Different manual penalties expire over different time periods, so older ones might already have lapsed. However, an expired penalty can be levied again, which is why the underlying issue must still be resolved.

If all is fine, the message "No manual webspam actions found" is shown.

  B. Algorithmic penalties. If no manual actions are visible in GSC, the penalty is probably algorithmic. Since Google gives no clear message for this kind of penalty, a bit of investigation is needed to find its cause. If the site's search engine health is monitored continuously, it is much easier to figure out which algorithm has affected it; this works well for detecting current or new algorithm updates that have hit the site.

Even a penalty that occurred in the past can be dated with the help of GSC and/or Google Analytics (GA): GA is helpful for traffic issues and GSC for indexing issues. For traffic issues, select a wide date range in GA's Audience Overview panel. The resulting graph usually makes the date on which the website was hit plainly visible.

For indexing issues, go to "Google Index -> Index Status" in GSC to see the rough date range when the drop began. To determine which algorithm caused the penalty, a third-party tool such as the Panguin Tool can be used: it lines up a graph of the website's traffic with markers indicating the points in time when Google's algorithm updates were run.

Once it has been determined whether the penalty was manual or algorithmic, the next step is to understand why it was imposed. That way, one can identify which violation needs to be fixed.

Understanding the penalty

There are many penalty types; some common ones are discussed here. (The next three are manual penalties.)

  • Unnatural links. Google uses backlinks (the links pointing to our website) as a measure of its quality. If Google sees that such links are 'unnatural', i.e. bought, manipulated, or exchanged as a favour such as reciprocal backlinking, the website will be penalized. People may also create 'unnatural' backlinks to a competitor's site, pointing irrelevant links at it to bring down its rank. One could also receive a manual action for "Unnatural links from your site". This type of penalty may apply to the whole website or be limited to a few pages.
  • Thin or duplicate content. Google tries to return the best original, well-written content when a user runs a query. Duplicate or thin content, which produces little or no value, is not appreciated by Google. Such content includes automatically generated content, thin affiliate pages, low-quality guest posts, etc., and it too may incur a manual action penalty from Google. The damage to a website's traffic can range from mild to severe.
  • Spam. Spam includes a wider variety of things such as cloaking, automatically generated content, and user-generated spam (caused by spam comments or forum profiles). Cloaking is a technique where different content is shown to a search engine bot than to a user; a quick self-check for it is sketched below. "Spammy freehosts" (hosts that primarily host spam sites) are also penalized, and if one's site is hosted on such a host, it will be penalized as well.
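
As a rough self-check for cloaking, one can fetch the same URL twice, once as a normal browser and once claiming to be Googlebot, and compare the responses. A minimal sketch using Python's third-party requests library; the URL is a placeholder:

```python
import requests

URL = "https://example.com/"  # placeholder: the page to check

# Fetch once as a regular browser, once pretending to be Googlebot.
browser_ua = {"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"}
googlebot_ua = {"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                              "+http://www.google.com/bot.html)"}

as_browser = requests.get(URL, headers=browser_ua, timeout=10)
as_googlebot = requests.get(URL, headers=googlebot_ua, timeout=10)

# A large difference is a hint of cloaking, not proof: legitimate sites may
# still vary output for bots (e.g. dropping ads or personalization).
print("browser bytes:  ", len(as_browser.text))
print("googlebot bytes:", len(as_googlebot.text))
if abs(len(as_browser.text) - len(as_googlebot.text)) > 0.2 * max(len(as_browser.text), 1):
    print("Responses differ noticeably -- inspect both versions manually.")
```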

The major algorithms that penalize a website are listed below (although Google tries to keep its algorithms as secret as possible):

  1. Panda algorithm. Panda looks at content quality in order to prevent low-quality or 'shallow' content from appearing in search results. The reasons for penalizing may be poorly written content, content too 'shallow' or brief to be valuable, substantially duplicate content, or content that doesn't add much value. Keyword-stuffed content, or pages crammed with hundreds of long-tail keyword variants, gets hit the worst.
  2. Penguin algorithm. Penguin mainly concerns itself with a site's backlinks and is designed to find unnatural link patterns. It checks link velocity (the rate at which links are gained over time), link quality, and link diversity. Penguin is page-specific; the traffic-generating page (which might have used unnatural backlinks) gets hit while other pages may remain unaffected. It may also discount some unnatural links, so that they no longer count toward rank at all.

Penguin is a massive algorithm and may take weeks to complete a full run. So even if the drop in traffic came some time after a Penguin run had started, Penguin could still be what is penalizing the website.

A manual action and Penguin may both hit a website at the same time for unnatural links.

Other well-known updates such as Pigeon, Payday, and Mobilegeddon may also affect search engine traffic.

Identifying and fixing all relevant problems

Once it has been identified what has hit the website, resolving it becomes much easier.

Issues such as unnatural links emanating from the website can be tackled by removing such links, especially those pointing to low-authority sites. If the site has spam in the form of user-generated content and comments, delete it and ensure that such spam does not accumulate on the website in the future.

  1. Unnatural links. Fixing this problem is a time-consuming job. Start by finding all the unnatural links that point to our website. GSC can be used to get them, but it may return only a partial list; other, paid services can provide a more complete one. In such tools, search for the domain name and click "inbound links" to fetch a list of all the backlinks. Download the list as a CSV file, which can easily be opened in MS Excel.

Now the unnatural links need to be picked out. If the website is small and not very popular, the number of links could be in the tens or hundreds, which makes flagging 'unnatural' links easy. If the website is big, there could be several hundred or even thousands of backlinks.
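
As a first pass over such a CSV export, a short script can flag links that deserve manual review. This is only a hedged sketch: the column names (source_url, anchor_text) and the heuristics are assumptions, since every backlink tool labels its export differently and real tools use far richer signals.

```python
import csv
from urllib.parse import urlparse

# Illustrative heuristics only; tune them to your own link profile.
SUSPECT_TLDS = {".xyz", ".top", ".loan"}
SPAMMY_ANCHORS = {"buy now", "cheap", "casino", "payday"}

with open("backlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        domain = urlparse(row["source_url"]).netloc.lower()
        anchor = row["anchor_text"].strip().lower()

        reasons = []
        if any(domain.endswith(tld) for tld in SUSPECT_TLDS):
            reasons.append("suspect TLD")
        if anchor in SPAMMY_ANCHORS:
            reasons.append("spammy anchor text")

        if reasons:
            # Flag for manual review; never treat a heuristic verdict as final.
            print(f"{domain}\t{anchor}\t({', '.join(reasons)})")
```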

In such a case, various paid tools can be used to flag unnatural links. Each tool uses its own formula to analyze the links, and these tools have other features that help identify penalties easily and improve our email outreach. (Email outreach is explained shortly.)

These tools may misjudge good or bad links, as they are not 100% accurate, so they are best used to narrow the list down to the clearly best and worst links. For the rest, manual inspection, though cumbersome, is the only option one has.

Some of the links can be spotted easily, for instance links contained in websites that are poorly written or that look very generic. All such links can be marked down, though even this exercise is subjective. Not all spurious links need to be removed to come out of a manual penalty, as this identification process can never be perfect.

To remove the identified links, each owner has to be requested to take down the links that point to our website. For this, contact information, i.e. the name and email address of each site owner, is required (this is where the aforementioned email outreach makes our task easy).

This can be done by manually finding the details of each link's owner or by using a tool to do it. Once all the contacts (which may number in the hundreds or thousands) are collected, a proper email requesting the owner to remove the links needs to be written and sent, as in the sketch below. This may get a lot of unnatural links removed.
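
Sending that request to hundreds of owners is easiest as a simple mail merge. A minimal sketch using Python's standard smtplib; the contacts file, SMTP host, sender address, and column names are all placeholders, and authentication/TLS is omitted for brevity:

```python
import csv
import smtplib
from email.message import EmailMessage

SMTP_HOST = "smtp.example.com"           # placeholder SMTP server
FROM_ADDR = "webmaster@oursite.example"  # placeholder sender address

BODY = """Hello {name},

Your site links to ours at {link}. We are cleaning up our backlink
profile and would be grateful if you could remove that link.

Thank you,
The webmaster"""

# Assumed columns: name, email, link_url -- adjust to your contact list.
with open("contacts.csv", newline="", encoding="utf-8") as f, \
        smtplib.SMTP(SMTP_HOST) as smtp:
    for row in csv.DictReader(f):
        msg = EmailMessage()
        msg["Subject"] = "Link removal request"
        msg["From"] = FROM_ADDR
        msg["To"] = row["email"]
        msg.set_content(BODY.format(name=row["name"], link=row["link_url"]))
        smtp.send_message(msg)  # polite pacing and retries omitted
```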

The other method is submitting a disavow file to Google. This file contains the list of domains we are asking Google to ignore, one entry per line. This step can be crucial in getting the penalty removed.
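
A disavow file is plain text: one entry per line, either a full URL or a whole domain prefixed with domain:, with # marking comment lines. A small example (all domains here are placeholders):

```
# Owners contacted with no response; disavowing the whole domains
domain:spammy-directory.example
domain:cheap-links.example

# Disavowing a single page rather than an entire domain
http://blog.example/spun-article.html
```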

  2. Fixing thin content. If the website has content that doesn't make much sense or adds little or no value, it is better to rewrite it or delete the irrelevant portions. Duplicate content can be fixed either by deleting it or by adding a canonical link.

Duplicate content can be found using GSC: go to "Search Appearance -> HTML Improvements" to see the issues, if any.
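
A canonical link is a single tag in the page's head that tells Google which URL is the preferred version of the content. A minimal illustration with a placeholder URL:

```html
<!-- On the duplicate page, point Google to the preferred version -->
<head>
  <link rel="canonical" href="https://www.oursite.example/original-article/">
</head>
```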

  3. Fixing a Panda penalty. Such penalties are algorithmic, so one might have to wait until the algorithm runs again to find out whether one's efforts were successful.

The most common things to try for removal of a Panda penalty are:

  • Getting rid of any thin or duplicate content (as discussed earlier).
  • Creating unique meta tags and titles for each page (see the sketch after this list).
  • Skimming through the articles for any awkwardness; forced keywords can be replaced by appropriate synonyms.
  • Eliminating cloaking, i.e. showing different versions of a website to Google's spider or bot than to a user (as defined earlier).
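
Unique titles and meta descriptions are just per-page tags in the HTML head; the point is simply that no two pages should share them. A small illustration with placeholder text:

```html
<head>
  <!-- Every page gets its own title and description; never copy them across pages -->
  <title>Hand-Knitted Woolen Scarves - Care Guide</title>
  <meta name="description"
        content="How to wash, dry, and store hand-knitted woolen scarves so they last for years.">
</head>
```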
  4. Fixing a Penguin penalty. The first thing to do is remove all unnatural backlinks to the website. Removing such links through the tedious and laborious process already discussed resolves a lot of problems.

Low-quality links can also be disavowed (as discussed earlier), although most low-quality links decay or lose value over time anyway. It is better still to invest time and resources in attracting high-quality links, as this algorithm is based on some kind of ratio between low- and high-quality links.

For manual penalties, ask for reconsideration

If one's website is hit by a manual action, a review or reconsideration request can be submitted. Submit it only when satisfied that enough has been done to fix the problem and that it will not occur again.

Once the request is submitted to Google, a quality rater attends to it. They will revisit and re-evaluate the website and, if satisfied, lift the manual action.

To make such a request, go to "Search Traffic -> Manual Actions" in GSC. For each manual action that struck the site, clicking "Request a Review" brings up a form. Fill it in with an adequate explanation of the action taken, along with some documentation as proof that the action was actually carried out.

Give every possible explanation of what caused the issue in the first place. Describe, step by step, how the issue was fixed (this is where the documentation is presented). Finally, tell Google what steps will be taken to avoid such manual penalties in the future.

Avoid getting penalized in the future

To reduce the chances of getting penalized again, some steps can be taken:

  • Perform regular audits. Carry out technical SEO audits as well as content audits regularly. Such audits involve monitoring links (in order to disavow spam backlinks, for instance), regularly checking GSC messages, checking for duplicate content, and disallowing spam blog comments.
  • Keep an eye on SEO sites. Since Google evolves and changes its quality guidelines regularly, it is important to follow SEO sites. This helps in keeping track of any big changes that might get our website penalized in the future. Being proactive helps avoid penalties and saves the trouble of fixing them later.
  • Avoid shortcuts. Building organic traffic to a website takes time and effort. It is better to avoid black-hat techniques and cheap tricks for gaining traffic, as they may have devastating consequences for the website later. Such tricks are tempting, but it is better to avoid them altogether.
