Academy of Internet Marketing

The Complete Guide to 2016 SEO (demoting factors)

Josh Bachynski has 16+ years of experience in SEO, digital marketing, and branding.


The previous lecture, The Complete Guide to 2016 SEO (positive factors), covered what should be done in order to get better rankings from Google.

However, Google’s demoting algorithms punish those who go too far or do too much – they put you in ranking jail. Below are the most widespread demoting factors that can push a website down in the rankings.


  1. Panda – the single biggest algorithm moving forward. Quality is the most important thing for Google, and Panda is its best algorithm for assessing quality. It runs monthly and unannounced, and it will demote an entire site if it deems it low-quality. More on the matter from Josh Bachynski:


Panda is based on usage metrics and onsite factors (the 2011 version was based on onsite factors only). Pay attention: as of 2016, Panda updates automatically.


How to avoid being demoted


  • Satisfied users: a conversion rate of over 60 percent.
  • A professional-looking site.
  • Legitimate business signals listed (e.g., an address).
  • Outlinks to authorities.
  • Content that is shared, liked, reviewed, and rated.



You risk demotion if you:

  • Have a boring blog (unless it is of Forbes-magazine quality).
  • Have superfluous keywords.
  • Have a bad reputation.
  • Have too many ads above the fold of the webpage, or pop-ups.
  • Have affiliate links – better to put them in an iframe.
  • Have URL parameters, especially session IDs, in navigation or canonical URLs (see the sketch after this list).
  • Have a bad design.
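One of the triggers above – URL parameters such as session IDs leaking into navigation or canonical URLs – is easy to check mechanically. Below is a minimal Python sketch; the URL list and the set of parameter names treated as session IDs are illustrative assumptions, not anything Google publishes.

    # Sketch: flag URLs whose query string carries session-style parameters.
    # The URL list and SESSION_PARAMS set are assumptions for illustration only.
    from urllib.parse import urlparse, parse_qs

    SESSION_PARAMS = {"sessionid", "sid", "phpsessid", "jsessionid"}

    def has_session_param(url: str) -> bool:
        """Return True if the URL's query string contains a session-style parameter."""
        query = parse_qs(urlparse(url).query)
        return any(name.lower() in SESSION_PARAMS for name in query)

    urls = [
        "https://example.com/category?sort=price",
        "https://example.com/category?PHPSESSID=abc123",
    ]

    for url in urls:
        if has_session_param(url):
            print("Keep this out of navigation/canonical:", url)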


  2. Penguin – the ‘webspam’ or ‘over-optimization’ algorithm. It acts on both onpage and offpage over-optimization, but is more link-based – it looks at the webpages that link to you. Penguin also pays attention to the relation between the exact match query (EMQ) of the backlink page and the target page. This includes your title, attributes, URL, content, and anchors – if a website has too many keywords in those places, it will most likely be hit by Penguin.
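To make the over-optimization idea concrete, here is a rough Python sketch that counts how often an exact-match phrase appears across a page’s title, URL, and anchor text. The page fields and the “too many” threshold are illustrative assumptions – Penguin’s actual thresholds are not published.

    # Sketch: count exact-match query (EMQ) occurrences across on-page fields.
    # Field values and the threshold are illustrative assumptions only.
    def emq_occurrences(phrase: str, fields: dict) -> dict:
        """Count case-insensitive occurrences of `phrase` in each field, treating hyphens as spaces."""
        phrase = phrase.lower()
        return {name: text.lower().replace("-", " ").count(phrase)
                for name, text in fields.items()}

    page = {
        "title": "Cheap Widgets - Buy Cheap Widgets Online | Cheap Widgets Shop",
        "url": "https://example.com/cheap-widgets/cheap-widgets",
        "anchors": "cheap widgets, cheap widgets, best cheap widgets",
    }

    counts = emq_occurrences("cheap widgets", page)
    total = sum(counts.values())
    print(counts, "total:", total)
    if total > 5:  # arbitrary illustrative threshold
        print("Heavy exact-match repetition - the pattern Penguin looks for.")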


The algorithm is punitive, and disavowing might not always help. There is a lot of evidence that making a disavow file does nothing for the current or any future Penguin check. It does, however, cause some link deletion, which might make the site better. In some cases, making a disavow file made rankings worse.


How to recover:

  • Remove your onpage keyword repetition.
  • Try to disavow, but it is better to delete 80%+ of your EMQ backlinks (titles, content, URLs, attributes).
  • Either 404 the target page or delete it.


Again, do not disavow ahead of time – much evidence suggests the site might get hurt and rankings go down.
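If you do decide to disavow after deleting what you can, the file Google’s disavow tool accepts is plain text with one URL or “domain:” entry per line and “#” lines as comments. A minimal sketch that writes such a file (the domains and URL listed are placeholders):

    # Sketch: write a disavow file in the format Google's disavow tool accepts
    # (one URL or "domain:" entry per line; "#" lines are comments).
    # The domains and URL below are placeholders.
    bad_domains = ["spammy-directory.example", "link-farm.example"]
    bad_urls = ["https://blog.example/old-advertorial.html"]

    with open("disavow.txt", "w", encoding="utf-8") as f:
        f.write("# Links that could not be removed manually\n")
        for domain in bad_domains:
            f.write(f"domain:{domain}\n")
        for url in bad_urls:
            f.write(url + "\n")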


  3. ‘Paid’ and affiliate links, advertorials – looked for by Google’s manual web spam team (12,000 people in India in 2012). The algorithm bubbles up suspects, which then get acted on – about 70-80% of spam form submissions are acted upon.


Avoid sidebar and footer links, directory lists, blogrolls, sitemaps, compilations of sources, and style links, both onsite and offsite. At best, these links are devalued; at worst, they trigger a manual action.


  4. Manual actions or ‘penalties’ – come from the Manual Web Spam Team. There are six or more types of manual actions, listed in Search Console under ‘Search Traffic’. You must have Search Console set up in order to see these penalties.


The two most important and most common of these are the ‘unnatural links’ notices. They come in two kinds: action vs. site and action vs. links.


If you have an action vs. site, you have to delete the links it points to, show good faith, disavow the remaining links that could not be deleted, and submit for reconsideration. If you are a first-time offender and you convince them you have cleaned up and will never do it again, you might get out. If you are a second- or third-time offender, give up – delete your website and make another one.


With an action vs. links, just do nothing. It is only a warning telling you to change your ways, not a call to take action – if you react, your rankings might go down even further.


  5. Duplication. There is no such thing as a duplicate content penalty. However, duplicate content does cause problems. First, it forces Google to choose which page to rank – and they always choose the one with the legacy trust and authority. Second, Google’s John Mueller has admitted they trust sites with duplicate content ‘less.’
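Duplicate pages on your own site are straightforward to spot before Google has to choose for you. A rough Python sketch using word-shingle Jaccard similarity; the shingle size, the 0.7 threshold, and the sample texts are assumptions for illustration.

    # Sketch: estimate near-duplicate content with word-shingle Jaccard similarity.
    # Shingle size, threshold, and sample texts are illustrative assumptions.
    def shingles(text: str, size: int = 5) -> set:
        """Break text into overlapping word n-grams."""
        words = text.lower().split()
        return {" ".join(words[i:i + size]) for i in range(max(1, len(words) - size + 1))}

    def jaccard(a: set, b: set) -> float:
        """Share of shingles the two sets have in common."""
        return len(a & b) / len(a | b) if (a or b) else 1.0

    page_a = "The complete guide to widget maintenance covers cleaning oiling and storage tips"
    page_b = "The complete guide to widget maintenance covers cleaning oiling and storage advice"

    similarity = jaccard(shingles(page_a), shingles(page_b))
    print(f"similarity: {similarity:.2f}")
    if similarity > 0.7:
        print("Near-duplicate content - Google will pick one page to rank.")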





  6. Bad merchant algorithm – runs sentiment analysis on ecommerce sites. It studies people’s sentiment about the site, and if that sentiment turns out to be mostly negative (more than 50 percent), the algorithm demotes the site.
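The threshold described above is simple to express. A toy Python sketch follows; the review labels are placeholders, and a real pipeline would first classify the review text.

    # Toy sketch of the "more than 50 percent negative" check described above.
    # Review labels are placeholders; real sentiment analysis would classify text first.
    reviews = ["positive", "negative", "negative", "positive", "negative"]

    negative_share = reviews.count("negative") / len(reviews)
    print(f"negative share: {negative_share:.0%}")
    if negative_share > 0.5:
        print("Mostly negative sentiment - the profile the bad merchant check demotes.")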


The best way to beat it is to have good sentiment about your site: make sure all ratings and reviews are positive, delete all the bad reviews, or change your domain name.


  7. Negative SEO – it comes in many forms. Google’s own demoting algorithms make negative SEO easier to carry out, although Google does take some measures to protect against it.


  • Link-based negative SEO – spammy backlinks pointed at your site. These are picked up by Penguin.
  • Other negative SEO – reporting a site to John Mueller via the spam form, launching a DDoS attack, creating 1,000 fake URLs on the site with keyword links, writing bad reviews, attacking its backlinks, scraping its articles, spamming its comments, and faking bad users.



Protect yourself:


  • Don’t just satisfy your users – amaze them. Being super-amazing will protect a site from negative SEO.


  • Older ‘spam’ domains are more susceptible, so consider sub-branding with a sub-domain or launching a new domain. This increases your protection and puts your eggs in several baskets.


  • Don’t rely on the signals of yesteryear – they know about those. Have the signals they do not know about, e.g., a website with zero links that still ranks thanks to conversion, brand searches, reviews, keywords in the right places, social profile activity, and so on.



Irina Titova



  • josh bachynski (SEO)

    This is a good summary of what I covered for the demoting factors – if anyone has any questions, or needs SEO help, by all means contact me!

  • Jeannie Hill

    Very interesting, Irina Titova and Josh Bachynski – thank you. These may be the best and most concise suggestions I have read on how to combat negative SEO.

    A denial-of-service attack, or DDoS, has been troublesome for some sites I manage. Right after some significant and hard-earned gains, an attacker sent such a high volume of traffic to a website that it went down. But it is more likely that a negative SEO attacker will prefer to slow a site down to the point that it is still in the search engine listings, yet Google almost ignores it because the site is slow.