Webmasters began optimizing websites for search engines in the mid-1990s, as the first search engines were starting to catalog the early Web. Initially, it was enough to submit a page's URL to the various search engines, which would then send out a web crawler to analyze the page and index it. [1] The crawler downloaded the page to the search engine's server, where a second program, the so-called indexer, extracted and cataloged information about it (the words it contained, links to other pages). Site owners quickly recognized the value of a prominent listing in search engine results, and companies specializing in this technique soon emerged. Early versions of the search algorithms relied on information supplied by the webmasters themselves, such as meta elements, or on index files in engines like ALIWEB. Meta elements provide an overview of a page's content, but it soon became apparent that this information was unreliable, since a webmaster's choice of keywords could misrepresent the actual page content. Inaccurate and incomplete data in meta elements could thus cause irrelevant pages to be listed for specific searches. [2] Page creators also tried to manipulate various attributes within a page's HTML code so that the page would rank better in the search results. [3]
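To make the fetch-and-index pipeline described above concrete, here is a minimal sketch in Python. It is an illustration only, not a reconstruction of any historical engine: the URL, the use of the standard library's HTMLParser, and the word/link extraction are all assumptions.

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class Indexer(HTMLParser):
    """Minimal indexer: catalogs words, outbound links, and meta keywords."""
    def __init__(self):
        super().__init__()
        self.words = []
        self.links = []
        self.meta_keywords = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and attrs.get("href"):
            self.links.append(attrs["href"])  # link to another page
        elif tag == "meta" and attrs.get("name") == "keywords":
            # Webmaster-supplied keywords: the self-reported data that
            # soon proved too unreliable for ranking.
            content = attrs.get("content") or ""
            self.meta_keywords += [k.strip() for k in content.split(",") if k.strip()]

    def handle_data(self, data):
        self.words += data.split()  # words for the catalog

# Fetch a page the way an early crawler would (URL is illustrative).
html = urlopen("http://example.com/").read().decode("utf-8", errors="replace")
indexer = Indexer()
indexer.feed(html)
print(len(indexer.words), len(indexer.links), indexer.meta_keywords)
```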
Since the early search engines depended heavily on factors that lay solely in the hands of webmasters, they were also highly vulnerable to abuse and ranking manipulation. To deliver better and more relevant search results, the search engine operators had to adapt to these conditions. Because a search engine's success depends on displaying relevant results for the queried keywords, poor results could drive users to look for other ways to search the Web. The search engines' response was to develop more complex ranking algorithms that incorporated factors which were difficult or impossible for webmasters to influence. Larry Page and Sergey Brin developed "Backrub", a search engine that rated the prominence of websites using a mathematical algorithm. The number the algorithm (PageRank) computed for a page is a function of the quantity and strength of its backlinks. [4] PageRank gives the likelihood that a user lands on a particular web page while surfing the Web and randomly clicking links.
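Expressed as a formula (in its commonly cited form; the exact parameters Google uses are not public), the PageRank of a page $p$ is

$$PR(p) = \frac{1-d}{N} + d \sum_{q \in B_p} \frac{PR(q)}{L(q)}$$

where $N$ is the total number of pages, $B_p$ is the set of pages linking to $p$, $L(q)$ is the number of outbound links on page $q$, and $d$ is a damping factor (commonly quoted as 0.85) modeling the probability that the randomly clicking user continues following links rather than jumping to an arbitrary page.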
Page and Brin founded Google in 1998; its simple design helped the search engine become a huge success. [5] [6] Google used off-page factors (such as PageRank and hyperlink analysis) as well as on-page factors (keyword frequency, metadata, page structure, etc.) for its ranking. Google thereby aimed to avoid the kind of manipulation that affected search engines relying only on on-page factors. Although PageRank was more difficult to influence, webmasters had already developed link-building strategies and tools that had allowed them to influence the ranking of the search engine Inktomi, and these methods proved similarly effective against PageRank. Many sites focused on buying, selling, and exchanging links, often on a large scale. These link farms often created thousands of pages with the sole purpose of link spamming. [7]
From 2004 onward, search engines incorporated a wide range of undisclosed factors into their ranking algorithms to reduce the influence of link manipulation. By 2007, Google was using more than 200 different signals to determine the ranking of specific pages in its search results. [8] The leading search engines Google, Bing, and Yahoo did not disclose their ranking algorithms. Some SEOs studied different approaches to search engine optimization and shared their findings. [9] Even patents relating to search engines were used to obtain information about how they function. [10]
In 2005, Google began personalizing search results based on the previous searches of registered users. [11] Bruce Clay consequently declared in 2008 that ranking had lost its meaning because of personalized search, and that discussions of manipulating search engine rankings were obsolete, since the same query could potentially yield different results for different users. [12]
In 2007, Google announced a campaign against paid links used to influence PageRank. [13] In 2009, Google announced that measures had been taken to limit the effects of so-called PageRank sculpting (a method of giving certain links on a page more weight so that their link targets rank higher in Google) carried out via the "nofollow" attribute on links. This attribute, originally introduced by Google, was no longer taken into account by PageRank, in order to mitigate the effects of PageRank sculpting by SEOs. [14] In response, other techniques soon came into use that replaced nofollow tags with hidden JavaScript code, making PageRank sculpting possible again. Other solutions, such as the use of iframes or Flash, were also proposed. [15]
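As a rough sketch of the mechanism (the class and the parsing approach are illustrative assumptions, not Google's implementation), a link extractor that honors nofollow simply skips such links when collecting edges for the link graph that PageRank is computed over:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects outbound links, separating out those marked rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.followed = []    # links that pass weight in the link graph
        self.nofollowed = []  # links excluded from the ranking computation

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if href is None:
            return
        rel = (attrs.get("rel") or "").lower().split()
        (self.nofollowed if "nofollow" in rel else self.followed).append(href)

extractor = LinkExtractor()
extractor.feed('<a href="/a">A</a> <a rel="nofollow" href="/b">B</a>')
print(extractor.followed)    # ['/a']
print(extractor.nofollowed)  # ['/b']
```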
In 2009, Google announced that it would draw on the search history of all users for popular search queries. [16] In 2010, Google introduced the new indexing system Google Caffeine. It allowed users to find news, forum posts, and other content much sooner after their publication. Google Caffeine represented a change in the way Google updates its index. [17]
The real-time search Google Instant was introduced in 2010 to make search results appear more timely and relevant. The growing popularity of social networks and blogs was taken into account by the real-time search, which placed an increased focus on these sources. [18]
With the Panda update of 2011, websites that copied page content from other sites were finally penalized. This technique had sometimes been used to rank higher in search results. [19] With Google Penguin in 2012, pages that apply manipulative techniques to improve their rankings were likewise penalized.
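Google has not published how Panda identifies copied content. Purely as an illustration of one standard near-duplicate detection technique (not necessarily Google's), documents can be compared by the overlap of their word n-gram ("shingle") sets using the Jaccard coefficient:

```python
def shingles(text, n=3):
    """Set of word n-grams used as a duplicate-detection fingerprint."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    """Overlap of two shingle sets: 1.0 means identical, 0.0 disjoint."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

original = "the quick brown fox jumps over the lazy dog"
copied   = "the quick brown fox jumps over a lazy dog"
score = jaccard(shingles(original), shingles(copied))
print(round(score, 2))  # 0.4 -- substantial shingle overlap with the original
```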