8 Types of Negative SEO to Watch Out For

SEO has always been a way to improve your search engine results. However, there are also techniques that cause the opposite effect. Negative SEO is used by disreputable SEO companies, and sometimes by competitors, to make a website's rankings drop. As with standard SEO, there are many ways in which negative SEO can be implemented.

Some people don’t believe negative SEO exists; they treat it like dark magic and assume they are safe. It is often these sites that end up affected by negative SEO. They lose positions and traffic, and getting everything back to the way it was becomes a hard task.

It is surprising how easy it is for a company or competitor to use negative SEO tactics. However, there are ways to protect yourself.

What is and isn’t negative SEO?

Negative SEO is best described as a set of activities aimed at lowering a competitor's rankings on search engines. These will often be off-site activities. An example of on-site negative SEO would be adding many duplicated pages that are hidden away from the main navigation.

Negative SEO, however, is not always behind sudden drops in your site's rankings. Because of the many factors Google looks at when ranking a website, there could be many reasons why your site has dropped. You should rule out everything else before deciding that negative SEO is the cause.

Negative on-page SEO

Let’s start with the on-page negative SEO techniques. These are difficult to implement and are normally only possible if someone hacks into your site. If a hacker gains access, any of the following threats can follow.

Modifying your content

This is a very common way for hackers to implement negative SEO on your site, and it goes far beyond changing the visible text. The change can be so subtle that you might not spot it. Spammy content is added to the site, and the hacker will often hide it from the normal display by adding a “display:none” style to the HTML. Whatever they add, be it a link or text, will be hidden from sight unless you check the code.
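
As a rough illustration, a small script can flag links that sit inside elements styled with display:none. This is only a heuristic sketch, not a full audit: it assumes balanced tags and only checks inline style attributes, not CSS classes or external stylesheets.

```python
import re
from html.parser import HTMLParser

class HiddenLinkFinder(HTMLParser):
    """Collects href values found inside elements styled with display:none.

    A rough heuristic: assumes balanced tags and only inspects inline
    style attributes, not CSS classes or external stylesheets.
    """

    def __init__(self):
        super().__init__()
        self.hidden_depth = 0   # nesting depth inside a hidden element
        self.hidden_links = []  # hrefs discovered inside hidden markup

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if self.hidden_depth or re.search(r"display\s*:\s*none", attrs.get("style", "")):
            self.hidden_depth += 1
        if self.hidden_depth and tag == "a" and "href" in attrs:
            self.hidden_links.append(attrs["href"])

    def handle_endtag(self, tag):
        if self.hidden_depth:
            self.hidden_depth -= 1
```

Running something like this over the HTML of key pages after each deploy, and diffing the result against the previous run, is usually enough to surface injected links.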

Another version of this is where the hacker redirects pages on your site elsewhere. The destination doesn't have to be their own site or anything relevant; they could redirect the page to a pornographic site if they felt like it. If Google spots this redirect before you do, it can penalise you for directing people to malicious sites. This will cause a major drop in your site's rankings that is very hard to recover from, and it leaves a horrible reputation on your domain that is very hard to repair.

However, there are hackers that will redirect the pages on your site to land on theirs. This is mostly done to high-authority businesses: hackers target authoritative sites with many links rather than small businesses, because the high-quality link they just created can drive traffic to their own site or improve its rankings.
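
A quick sanity check along these lines: given a page's URL and the status code and Location header of its response (fetched with redirects disabled), flag any 3xx that leaves your domain. The URLs below are purely illustrative.

```python
from urllib.parse import urlparse

def is_offsite_redirect(page_url, status, location):
    """True when a 3xx response would send visitors to a different domain.

    In practice you would fetch each page with redirects disabled and
    pass in the status code and Location header yourself.
    """
    if status not in (301, 302, 303, 307, 308) or not location:
        return False
    target = urlparse(location).netloc
    # Relative Locations (empty netloc) stay on the same site.
    return target not in ("", urlparse(page_url).netloc)
```

Run over a list of your important pages on a schedule, any True result is worth an immediate look.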

How do you avoid this?

Hacking is hard to avoid nowadays, but there are ways to catch any changes a hacker makes. Using a tool like WebSite Auditor, developed by SEO PowerSuite, is one of them. The tool shows the number of internal and external links currently on your site. Check this regularly in case there is a sudden, massive increase in the number of links.

Once you open WebSite Auditor, you are asked to input a domain name to get started. If you already have the application, all you need to do is press ‘New’ to start a new project. From your site being mobile responsive to any 4xx errors (such as 404s), this tool runs through all the checks it needs to. Once the first check has been done, you can repeat it regularly: clicking the ‘Rebuild Project’ button at the top right of the application redoes the checks. This lets you track the number of outgoing links and spot changes that might otherwise go unnoticed.

The links are not listed on the main page. To look at them in more detail, head over to the All Resources tab on the dashboard. From there, the External Resources section on the bottom left lists the number of links found. This is the figure to keep a record of: if, during a future check, it increases by a dramatic amount, you can act. The pages where the links were found are listed there as well.
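
The “act on a dramatic increase” step can be automated once you record the external-link count from each audit. A minimal sketch, where the 1.5x growth cutoff is an arbitrary illustration rather than a recommendation:

```python
def link_count_alert(previous, current, ratio=1.5):
    """Compare external-link counts between two audits.

    Returns True when the count grew by more than the given factor;
    the 1.5x default is an illustrative threshold, not a standard.
    """
    return previous > 0 and current / previous > ratio
```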

Getting the site de-indexed

This is a devastating negative SEO tactic that can destroy your rankings and ruin any company's SEO campaign. It takes nothing more than a small change to the robots.txt file: one added disallow rule is all it takes to tell Google to ignore pages, or even the whole site.
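
You can monitor for this with Python's standard robots.txt parser. A sketch that reports whether Googlebot is blocked from a URL; the example URL is a placeholder for your own domain.

```python
from urllib.robotparser import RobotFileParser

def googlebot_blocked(robots_txt, url="https://example.com/"):
    """Parse a robots.txt body and report whether Googlebot is
    disallowed from the given URL (the default URL is a placeholder)."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return not parser.can_fetch("Googlebot", url)
```

Fetching your live robots.txt on a schedule and alerting when this flips to True catches the Disallow: / sabotage described above within hours rather than weeks.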

You can find stories of this happening online, and it is something that needs to be checked regularly. In some reported cases, people fired an SEO agency they weren't happy with, and the agency then added a Disallow: / rule to the client's robots.txt file. Agencies like this exist, which is another reason to be very careful!

How do you avoid this?

Again, tools let you make regular checks that keep your site safe. SEO PowerSuite has also developed Rank Tracker, and there are many other online tools that allow you to check your site's rankings. Some must be paid for; some are free with limitations.

With Rank Tracker, you can schedule automatic checks to run daily or even weekly, so you get constant updates when your site's rankings change. If your site had good rankings before but keywords start showing up as ‘Not in top 50’, something has happened, and it needs to be actioned quickly to minimise the disruption to your rankings.
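
If your rank tracker can export keyword positions, the ‘Not in top 50’ check is easy to automate between two snapshots. A sketch, where the dict shapes are assumptions about your export format:

```python
def dropped_keywords(previous, current, cutoff=50):
    """Find keywords that fell out of the top results between two checks.

    previous/current map keyword -> rank, with None meaning "not found";
    the dict shape is an assumption about your rank tracker's export.
    """
    return [kw for kw, rank in previous.items()
            if rank is not None and rank <= cutoff
            and (current.get(kw) is None or current[kw] > cutoff)]
```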

In the image below, you can see what the tool looks like. The columns are customisable through an option in the top right of the platform. The column titled Difference will say Dropped when results have fallen; a new project, like the one shown, will not have this yet. This tool does require a paid licence: without one, you lose the ability to save projects.

Hacking the site

A hacker might not have negative SEO in mind at all. However, simply having your site compromised can hurt your SEO. Search engines want their users to be safe and secure when accessing websites through them, which is why they run such strict security checks on the sites they list.

If your site was hacked and Google noticed, it will do one or both of two things: lower your site's rankings, and/or add a line of text reading “This site may be hacked” to your search result snippet. If you saw something like the image below, would you click on it? Having this on your search results will not only hurt your SEO; it will greatly reduce your search traffic and quite possibly the conversions through the site.

How do you avoid this?

Improving your site's security should always be at the top of your list. It is the best way to ensure that no one can access the site without the correct details. Sites often have a Content Management System (CMS) that lets people log in to edit the content easily. Anyone with login details needs a secure password and must keep it confidential, and if anyone leaves the company, their access needs to be removed.

There are many articles online that cover further methods of making your site more secure and less vulnerable to hacking. One example is ’12 Tips to Protect Your Company Website from Hackers’.

Negative off-page SEO

These are ways people attempt to affect your SEO without touching your site directly. They won't edit your content or anything like that, but will instead use external means to harm your site.

Link farms

Over time, you might pick up a couple of spammy links, which won't affect your site's rankings: a small number of links is often ignored and can be disavowed. With negative SEO, however, it is not a case of one or two spammy links; many are created at once. Using what are known as link farms, large groups of interconnected sites are built and then linked to your site, creating hundreds to thousands of new links.

The anchor text is quite often the same for each link, and the anchors don't have to be related to the content on the site; often enough they aren't. In other cases the anchor text contains a keyword, to make the link look like it came from the owner of the website. Both are harmful to the site if there are too many of them.

There have been instances where websites acquired many links through a link farm and ended up losing rankings for almost every search they were targeting. Thankfully, if you catch a link farm early on, you can disavow any links you don't like the look of and then work towards getting your site ranking again.
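
If your backlink tool exports new links with the date they were found, a spike-and-anchor check catches most link-farm bursts. The tuple shape and the 20-links-per-day threshold below are illustrative assumptions, not tool defaults.

```python
from collections import Counter

def flag_link_spikes(backlinks, daily_limit=20):
    """Spot link-farm bursts in a backlink export.

    backlinks is a list of (found_date, anchor_text) tuples; the tuple
    shape and the daily_limit threshold are illustrative assumptions.
    Returns the suspicious dates and the most repeated anchor text.
    """
    per_day = Counter(date for date, _ in backlinks)
    anchors = Counter(anchor for _, anchor in backlinks)
    spikes = [date for date, count in per_day.items() if count > daily_limit]
    return spikes, anchors.most_common(1)
```

A date with dozens of new links, all carrying the same anchor text, is exactly the pattern described above and a strong candidate for the disavow file.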

How do you avoid this?

Preventing a negative SEO attack delivered through hacking or off-page methods is largely out of your power. However, finding the issue and fixing it will reverse most, if not all, of the damage. Another tool lets you check your link profile growth: SEO SpyGlass, also by SEO PowerSuite, is one example that gives you this power.

SEO SpyGlass is a good tool to use as it gives you progress graphs showing the number of backlinks to your site and the number of referring domains. These are a great way to spot a sudden change in links: if there is a large spike in any of these graphs, you should be asking where those links were suddenly acquired.

To look further into where the links are coming from, go to the Linking Domains or Backlinks page on the dashboard. Once there, double-click the header of the column titled Last Found Date to list the links in date order. From here, look at when the links were found and whether they all appeared close together.

There are multiple parts of this tool that give you more information on each link. The penalty risk is a key one for assessing new links: it provides a value between 0 and 100 based on the risk of each link. From this you can find out whether links come from the same IP address or C block, which can tell you if they are from a link farm.
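
The same-C-block signal can also be checked directly from a list of backlink source IPs. A minimal sketch that groups IPv4 addresses by their first three octets; the IPs you pass in would come from your backlink tool's export.

```python
from collections import Counter

def shared_c_blocks(ips):
    """Group backlink source IPs by their /24 ("C") block.

    Many links from the same block is a classic link-farm signature;
    the input is assumed to be a list of IPv4 address strings.
    """
    blocks = Counter(ip.rsplit(".", 1)[0] for ip in ips)
    return {block: count for block, count in blocks.items() if count > 1}
```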


Content scraping

Scraping is the copying of content from your site onto another site, and not necessarily a competitor's. If two or more sites carry the same content, Google will only pick one of them to rank, and it might not be yours. Google is smart, however, and often does identify where the content originally came from. That is, if it doesn't find the stolen version first.

This is very hard to avoid, as there is no real way to stop people from copying your content. Scrapers usually act quickly, reposting any new content straight away on the duplicate site or sites.

How do you avoid this?

Using tools that evaluate the copy on your website against other sites is one of the best ways to find duplication; Copyscape is one of these tools. However you find them, if you discover scraped copies of your content, try to get in contact with the webmaster. If they will not remove the content, or you can't get in touch with them, report the scraper using Google's copyright infringement report.
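
For a quick in-house check before reaching for a paid tool, the standard library can score how similar two pages' text is. A rough sketch; real scrapers often tweak wording, so treat the ratio as a hint to investigate rather than proof.

```python
from difflib import SequenceMatcher

def copy_ratio(original, suspect):
    """Rough similarity score between two pages' text (1.0 = identical).

    A high ratio for text found on a third-party URL suggests scraped
    content; treat it as a hint to investigate, not as proof.
    """
    return SequenceMatcher(None, original, suspect).ratio()
```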

Forceful crawling

There are instances where site owners want their site to rank higher but have competitors in their way. Let's say you were one of those competitors. The attacker tries to crash your site by having it crawled aggressively and repeatedly, causing a heavy server load. If Googlebot then can't access the site to crawl it, your rankings can suffer.

How do you avoid this?

Check for this by manually accessing your site. If it is slow or won't load, get in contact with someone: either your hosting company or your webmaster should be able to tell you why there is a long load time. There are also sources online detailing what might cause long load times. For people with more knowledge of server logs, there are detailed instructions on discovering crawlers; from there you can block them using robots.txt and .htaccess files.
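
For those comfortable with server logs, even a few lines of scripting will surface an aggressive crawler. This sketch assumes common/combined log format, where the client IP is the first space-separated field:

```python
from collections import Counter

def top_clients(log_lines, n=3):
    """Count requests per client IP in access-log lines.

    Assumes common/combined log format, where the client IP is the
    first space-separated field; an aggressive crawler shows up as a
    disproportionately large count.
    """
    ips = Counter(line.split(" ", 1)[0] for line in log_lines if line.strip())
    return ips.most_common(n)
```

Any IP responsible for a disproportionate share of requests is a candidate for a robots.txt rule or an .htaccess block.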

Negative SEO YOU Can Cause

So far we have only talked about negative SEO that other companies can implement on your site. However, these are not the only sources of negative SEO: YOU can damage your own SEO without even realising it.

Creating duplicate sites

There are companies out there that believe having multiple sites is a good thing. If your company provides many different products (e.g. window film, garage doors, etc.), they should all be placed on a single site. If the idea of separate sites for different products comes up, drop it. Multiple sites using the same content not only thin out your content but look very spammy; Google will see them as duplicates and only rank one of them.

If you have a lot of services that you want to promote, keep them all on one site. From there, you can create a customer journey that takes users where you want them to go, through pictures, image sliders or other means.

Extensive duplication of content

Your website should include a list of the locations you cover. This is standard on almost every site and is expected to show up on each page. All other content, however, should be unique to each page. Say you supply a service to both domestic and commercial properties and have a separate page for each: the content must be unique to each page, or one page must redirect to the other. If the content is duplicated, Google will notice and penalise your site.
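
When you do keep separate domestic and commercial pages, you can measure how close to duplicates they are. A sketch using word shingles; the 5-word shingle size is an arbitrary choice for illustration.

```python
def shingle_overlap(text_a, text_b, k=5):
    """Jaccard overlap of k-word shingles between two pages' copy.

    Values near 1.0 mean near-duplicate pages; the 5-word shingle
    size is an arbitrary choice for illustration.
    """
    def shingles(text):
        words = text.lower().split()
        return {tuple(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}
    a, b = shingles(text_a), shingles(text_b)
    return len(a & b) / len(a | b)
```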

There are ways to stop this from happening: either take the time to write the content for each page yourself, or hire a professional copywriter to write it for you.

In conclusion, you should always be checking your site to ensure no negative SEO is being implemented on it. From links to copied content, you need to keep your site safe from the negative SEO that could cost it its rankings.