Google to Strengthen Rules on Spammers and Content Farms in 2011

With about 66 percent of the total U.S. online search market, Google is the largest search engine in the world.

Google has always been a target for marketers and spammers who churn out low-quality, unreliable content across thousands of topics for the sole purpose of generating revenue from CPC advertisements, manipulating that content to push articles high up in search results.

Because it is the top search engine, Google attracts a disproportionate amount of spam, even though its rivals face the same challenges.

Google admits there was an increase in spam in its index over the last few months, but the company also says it has launched new tools to mitigate the problem, and it continues to develop new systems to continuously improve search relevance.

Google has greatly improved how it ranks web pages included in its index to increase the quality of search results with more reliable content by penalizing spammy websites and other poor quality sources.

The company says its new classifier system better detects spam by analyzing on-page content for repeated use of known spammy keywords, effectively stopping those pages from ever ranking highly in search results.
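Google has not disclosed how its classifier actually works; as a purely hypothetical sketch, a keyword-density approach like the one described could look something like this (the term list and threshold are invented for illustration):

```python
# Toy illustration of keyword-based spam scoring -- NOT Google's actual
# classifier. Flags a page whose text leans too heavily on a small set
# of known spammy terms. Term list and threshold are invented.
import re

SPAMMY_TERMS = {"cheap", "viagra", "casino", "free", "winner"}  # hypothetical list

def spam_score(page_text: str) -> float:
    """Fraction of words on the page that match known spammy terms."""
    words = re.findall(r"[a-z']+", page_text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in SPAMMY_TERMS)
    return hits / len(words)

def looks_spammy(page_text: str, threshold: float = 0.2) -> bool:
    """Flag the page when spammy terms exceed the chosen density."""
    return spam_score(page_text) >= threshold
```

A real system would of course weigh many more signals than raw keyword density, which is trivial for spammers to game on its own.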

As the offensive against spammers intensifies, the spammers are using more advanced methods to circumvent emerging anti-spam systems.

Spammers now use advanced automated scripts that probe established, high-PageRank websites for exploits, injecting malicious code into system files to boost the spammers' own rankings.

These attacks largely go unnoticed by webmasters, even after a successful SQL injection has covertly replaced a site's content with spam that is visible only to Google's robots and spiders.
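Because the injected spam is shown only to crawlers, one way a webmaster might spot this kind of cloaking is to fetch their own page twice, once with a browser User-Agent and once with Googlebot's, and compare the two responses. The sketch below (hypothetical helper names; the fetching step is left out so it works on any two HTML strings) uses a simple diff ratio:

```python
# Sketch of cloaking detection: compare the page served to a normal
# browser with the page served to Googlebot's User-Agent. The User-Agent
# strings are real formats, but the comparison threshold is invented.
import difflib

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/535.7"

def cloaking_ratio(html_as_browser: str, html_as_googlebot: str) -> float:
    """Return dissimilarity between the two versions (0.0 = identical)."""
    sim = difflib.SequenceMatcher(None, html_as_browser, html_as_googlebot).ratio()
    return 1.0 - sim

def is_cloaked(html_as_browser: str, html_as_googlebot: str,
               threshold: float = 0.3) -> bool:
    """Flag the page when the two versions differ substantially."""
    return cloaking_ratio(html_as_browser, html_as_googlebot) > threshold
```

In practice the two versions would be fetched live (e.g. with urllib, setting each User-Agent header in turn) and some threshold tuned to tolerate legitimate per-visitor variation such as ads.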

Google says it has “radically improved” its ability to detect hacked websites, which it says were a significant source of spam in 2010.

The web giant says it is now shifting its attention to content farms, which are websites with large quantities of low quality articles.

Essentially, those websites write content around common search terms, including keywords with low competition, to drive traffic and revenue from ad clicks.

Aiming to give more credit to original content creators and to improve relevancy, Google also said on Friday that it was evaluating changes that would lower the rankings of websites that copy original content, including sites that mix some original material with a larger share of copied content.
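Google did not say how it identifies copied content. One textbook technique for the general problem (a sketch under that assumption, not Google's disclosed method) is to break each document into overlapping word "shingles" and measure Jaccard similarity between the sets:

```python
# Duplicate-content detection via word shingling and Jaccard similarity.
# A standard textbook approach, shown here only as an illustration of
# how copied text can be measured; k and any cutoff are arbitrary.

def shingles(text: str, k: int = 3) -> set:
    """All overlapping k-word sequences in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: str, b: str, k: int = 3) -> float:
    """Similarity of two documents: 1.0 = identical shingle sets."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

A high Jaccard score between a page and an earlier-indexed page is strong evidence of copying; at web scale this is typically approximated with MinHash rather than computed exactly.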

People question why these marketers and businesses are in this business at all; the truth is simply that there is a lot at stake.

Take, for example, Demand Media, a firm that creates thousands of articles per month tailored for search engines rather than inspired, value-adding content.

That content, which is already search engine optimized, is then promoted on social media websites, in turn pushing its rankings higher up in search engines.

Demand Media, which owns several popular web properties, publishes about 4,000 articles a day (part automation, part human editorial) and has received $375 million in total funding, according to the company database website CrunchBase. It is also one of the 30 most popular websites in the United States, according to the Internet analytics firm comScore.

The problem is that this content is usually dubious, low quality, and highly unreliable, and its only purpose is to get readers to click on affiliate or other CPC advertisements.

Google says in 2010 it made two significant changes (specific details were not publicly disclosed) to its algorithmic systems aimed specifically at content farms, in the process penalizing and even banning many of those sites from its index.

Google maintains that it does not use human intervention in ranking search results, but perhaps the company should reconsider this policy and selectively remove low-quality sites from its index entirely.

As more spam infiltrates search engines, top publications realize people simply want quality content that is timely, relevant, reliable, free from material errors, and trustworthy.

Some news publications have adopted subscription-based models, even blocking Google from indexing their content.

Those publications argue that people want trustworthy information and don’t need to go to search engines to find it; they can go directly to the publication itself.

If Google chooses not to increase human intervention in determining search rankings, the company should at least introduce new query parameters that let people filter results, for example to show only top websites.
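No such parameter exists today; as a hypothetical sketch of what the "top websites only" idea could mean in practice, a client could simply post-filter results against an allowlist of trusted hosts (the host list and result format here are invented):

```python
# Hypothetical client-side version of a "top websites only" filter:
# keep only results whose host is on a trusted-site allowlist.
# The allowlist below is an arbitrary example.
from urllib.parse import urlparse

TRUSTED_HOSTS = {"en.wikipedia.org", "nytimes.com", "reuters.com"}

def filter_results(urls):
    """Keep only result URLs served from a trusted host or its subdomain."""
    kept = []
    for url in urls:
        host = urlparse(url).netloc.lower()
        if any(host == t or host.endswith("." + t) for t in TRUSTED_HOSTS):
            kept.append(url)
    return kept
```

A server-side parameter would be far more useful, since it could filter before ranking rather than discarding results after the fact.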

Google needs to find a way to balance the reliability of content sources and at the same time to preserve equity in the marketplace when it comes to sending traffic to specific websites.

Incoming chief executive and Google co-founder Larry Page, now 38, will face the difficult challenge of not only improving search results but also evolving the company and taking it to the next level as it faces intensifying competition from rivals like Microsoft’s Bing and Facebook, among others.

Hercules holds a B.Comm Finance from Ryerson University in Toronto, Canada. He is a Chartered Financial Analyst (CFA) level 3 candidate. He was previously a contributor at FiLife, a finance website owned by Dow Jones and IAC. Write to [email protected]
  • Justwannabefree

    Google to censor search terms?? Interesting eh?? Looks like and sounds like, is a fascist dictatorship to me. However Google’s programme slots in nicely with Senator Jay Rockefeller’s Cyber security Bill! Rockefeller is known to have lamented the invention of the internet, wonder why??

    • Tizvini

How is Google going to censor search terms? I’m not understanding where you got that info


A long time coming. Don’t throw the baby out with the bath water. Don’t get personal with the algorithms. Local search services like Groupon Daily Deals are expanding search options for consumers. Google has also turned up the heat on companies like Yext!

  • Anonymous

To some extent I do agree with David Ogletree’s comments; however, I’m sure Google has a hard time keeping ahead of all those who would seek to abuse the system. As always, the “law” only affects the “law-abiding”.

  • InMktgWeTrust

It’s about time indeed for Google to fight spammers. Most #SEO content created on the market is spun content that is absolutely out of date. And as competition gets tougher, SEO gets more and more expensive.

  • Ash Singh

It’s good to know that Google is trying to protect websites from spammers. I have heard that Google will allow website owners to blacklist spam websites manually, but I’m not too sure. It would give users a great privilege, but along with loads of disadvantages….

    Ash Singh

Business 2.0 Press has published exclusive business tech news and analysis, covering start-ups to large-caps from Bay and Wall streets, since 2008, from a group of highly knowledgeable industry professionals who abide by the toughest industry codes of conduct and professional standards.




Colon cancer is one of the leading causes of death. Irrespective of family history, everyone is exposed to the risk. About 90% of colon cancer cases begin from non-cancerous tumors, polyps, which could form in the large bowel. Screening with a colonoscopy will painlessly remove any polyps hence almost entirely reducing your risk of developing the horrible disease. The good news is that about 90% of colon cancer cases are preventable through a simple (yes, simple) colonoscopy.
Learn moreatom
Public service message from Business 2.0 Press