Google’s Fight Against Web Spam and Lessons You Can Learn from It
Google recently released its annual report detailing how it policed the Internet over the past year, or rather, the portion of the Internet that makes it into search results. Much of it is a pat on its own back, and while there are some valid complaints about Google’s penalties, it is important to know exactly what Google considers a low-quality web experience so that you don’t get caught out.
Bear in mind that for many users, Google is the Internet, or at the very least their doorway into it. Remember also that, where the rubber meets the road, Google is a product whose survival depends on its ability to make money. It does this by ensuring that users derive the best possible experience from it; otherwise they will go elsewhere.
As such, Google must make sure that the results it provides on its search engine results pages (SERPs) are as high-quality and relevant to user queries as possible. This is why it constantly tweaks its algorithms and employs people to carry out manual reviews, weeding out crappy sites with thin, manipulative, harmful and/or stolen content from its index.
Below we describe how Google does its job, and what you can do to stay on its good side.
Google’s report on web spam
According to the report, Google implemented an algorithmic update to remove web spam from its SERPs, a move that affected 5 percent of queries. The remaining web spam was tackled manually: Google employees sent over 4.3 million messages to webmasters whose sites had attracted manual penalties for spamming following review.
Through this move, Google saw a third of the penalized sites work to clean out their spam in a bid for reconsideration by the search engine. It remains unclear whether the remaining two thirds have been dropped from Google’s index or are still appealing their penalties.
Google users worldwide manually submitted over 400,000 spam reports in the last year. Google acted on 65 percent of them, and judged 80 percent of the reports it acted on to be genuine spam.
There was a whopping 180 percent rise in the number of websites hacked in 2015 compared to 2014. Hacking presents in a number of ways, among them malware and website spamming attacks, both with similar results: a hacked site will be removed or flagged and placed ‘in quarantine’. Google has released a list of official guidelines to help you sidestep hackers, including:
- Improving account security by using long, hard-to-guess passwords, and making each password unique to every Internet platform you access
- Updating site software on a regular basis, particularly the content management system and related plugins
- Researching web hosts’ security policies, including how they respond to security concerns and clean up hacked sites, e.g. do they provide live support should your website fall under attack?
- Using monitoring tools to identify potentially hacked site content. This begins with signing up for Google Search Console, since that is how Google notifies you about problems on your site.
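The first of these guidelines, long and unique passwords, is easy to automate rather than attempt by hand. Here is a minimal sketch in Python using the standard-library `secrets` module; the function name and platform labels are purely illustrative:

```python
import secrets
import string


def generate_password(length=20):
    """Generate a random password from letters, digits, and punctuation.

    Uses the secrets module, which draws from the OS's
    cryptographically secure random number generator.
    """
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))


# One unique password per platform, as the guidelines suggest.
# The platform names here are hypothetical examples.
passwords = {site: generate_password() for site in ("cms-admin", "hosting", "dns")}
```

In practice you would store these in a password manager rather than generating them ad hoc, but the point stands: never reuse one password across your CMS, hosting panel, and domain registrar.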
Publishing thin, low-quality content
In the last year, Google saw a rise in the number of websites with low-quality, thin content, a considerable proportion of which was likely served by scraper sites.
Sadly, there is little recourse if your site is being scraped, particularly since Google discontinued its reporting tool and takes the view that scraping is a site’s own problem to solve. You simply need to be confident in your website’s architecture and authority, and trust that your original content will rank better than the scraper site’s copy.
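You can at least monitor for scraped copies yourself. One common approach is to break text into overlapping word "shingles" and compare pages by Jaccard similarity; a near-1.0 score against a suspect page suggests a lifted copy. A minimal sketch (the function names and threshold are illustrative, not any official tool):

```python
def shingles(text, k=5):
    """Split text into the set of overlapping k-word shingles, lowercased."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}


def jaccard_similarity(a, b, k=5):
    """Jaccard similarity of two texts' shingle sets:
    size of the intersection divided by size of the union."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not (sa or sb):
        return 0.0
    return len(sa & sb) / len(sa | sb)


# Hypothetical usage: compare your article against text fetched from
# a suspect page; a score above, say, 0.8 is worth investigating.
```

This is exactly the kind of low-effort check worth running periodically against pages that outrank you for your own headlines.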
If you have attracted a Google manual penalty citing “thin content with little to no added value”, there are steps you can take to improve your position. These are listed below, but can all be summarized as “Don’t publish crappy content, duh!”
- The first step is to check your site as follows:
- Auto-generated content – this is a bad idea. Auto-generated content reads as though it was written by a machine, which it probably was
- Thin content containing affiliate links – affiliate links are perfectly fine in high-quality articles, but pages that list reviews or product descriptions lifted straight from the original retailer, with little or no unique content added, are bad. The rule of thumb is that affiliate content should account for only a very small part of your site
- Scraped content – if your site automatically lifts and republishes entire articles from other websites without their permission, then you had this one coming, and you know what to do
- Doorway webpages – these are webpages that seem different and can show up multiple times in SERPs for a specific query, yet they all lead users to the same page. They are frequently used for rankings manipulation.
- Throw them all out
- If after doing so you’re positive that your site has some value to offer users, submit a reconsideration request to Google through Search Console.
Webmasters can use the Webmaster help forum for support, a community of more than 35,000 users that includes designated Top Contributors. Webmasters have long used the forum to get their questions and concerns addressed, and you’re likely to find what you’re looking for there. As the saying goes, there’s nothing new under the sun: chances are that someone else has already worked through the exact issue you’re facing.
That being said, Google has stated its continued commitment to improving its web-spam-fighting technologies and collaborating with both users and webmasters to support a high-quality Internet environment. Fighting spam is one of the main ways Google maintains top search quality, ensuring that users enjoy a wholesome web experience each time they search.
Now that you know what to do and what not to do, go forth and do not publish crappy content. Rather than fighting fires, staying on Google’s good side is quite simple: focus on unique, high-quality, relevant, in-depth and engaging content on every single one of your pages. Now that’s not so hard, is it?
Thomas Mark is an expert white label SEO consultant who has worked with thousands of sites in his twenty-plus years in SEO. He is a top contributor to many sites across the Internet.