When Matt Cutts speaks, the SEO world listens. As head of Google’s Webspam team, Cutts is a respected name in the search engine world, and changes to the world’s largest engine do not pass unnoticed. Cutts recently announced measures Google is adopting to stem a slight “uptick” in indexed spam and keep search results relevant. Google is taking aim at low-quality and duplicate content in an effort to keep “spammy” sites from ranking highly.
Cutts announced that Google had launched a redesigned document-level classifier that “makes it harder for spammy on-page content to rank highly.” It works to detect spam signals on the page itself, such as keyword stuffing. Google has also “radically improved” its ability to detect hacked sites, which are a big contributor to spam results. Cutts announced one more change aimed at reducing spam: targeting sites with duplicate content or very little original content.
How do the changes affect your site? Even minor changes to Google’s algorithm can produce unpleasant results for sites, including legitimate ones. It is important to keep an eye on the SERPs to see whether the change is affecting your website. And if it is? It could be that you have duplicated or copied content on your website. This doesn’t necessarily make you a spammer or a plagiarist; it may be that you need to cut back on quotes from other sources and include more of your own original material. It could also mean that another site has copied content from you, and Google has mistaken that other site for the originator of the material.
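If you suspect two pages share too much text, a rough similarity check can flag near-duplicates before Google does. The following is a minimal sketch using Python’s standard difflib module; the sample strings and the 0.8 threshold are illustrative assumptions only, not anything Google has published about its classifier.

```python
from difflib import SequenceMatcher

def similarity(text_a: str, text_b: str) -> float:
    """Return a rough 0.0-1.0 similarity score between two blocks of text."""
    return SequenceMatcher(None, text_a, text_b).ratio()

# Hypothetical page excerpts used purely for illustration.
original = "Google is taking aim at low quality and duplicate content."
copied = "Google is taking aim at low-quality and duplicated content."

score = similarity(original, copied)
print(f"similarity: {score:.2f}")

# Illustrative threshold, not an official cutoff from Google.
if score > 0.8:
    print("These pages look like near-duplicates; consider rewriting one.")
```

A simple check like this won’t replicate a search engine’s classifier, but it is a quick way to audit your own pages for passages that lean too heavily on a single quoted source.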
In either case, it is important to address the content in question. Even if you are the wronged party and another site has copied your original content, the appeal process is largely nonexistent. A quicker, and ultimately easier, fix is to replace the content with something new and fresh.
Spam is an ongoing issue in the search world, and we can expect to see more changes from Google, as well as other search engines.