HISTORY

Webmasters and content providers began optimizing sites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all a webmaster needed to do was submit the address of a page, or URL, to the various engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return the information found on the page to be indexed. The process involves a search engine spider downloading a page and storing it on the search engine's own server, where a second program, known as an indexer, extracts information about the page: the words it contains and where they are located, any weight given to specific words, and all the links the page contains, which are then placed into a scheduler for crawling at a later date.
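As an illustration of that crawl-and-index loop, the sketch below uses only the Python standard library. The parsing, storage, and scheduling details here are simplified assumptions for illustration, not a description of how any early engine was actually implemented; a real spider also handles robots.txt, politeness delays, deduplication, and much more.

    from collections import deque
    from html.parser import HTMLParser
    from urllib.parse import urljoin
    from urllib.request import urlopen

    class LinkAndTextParser(HTMLParser):
        """Pull out the words on a page and the links it contains."""
        def __init__(self):
            super().__init__()
            self.links, self.words = [], []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

        def handle_data(self, data):
            self.words.extend(data.split())

    def crawl(seed_url, max_pages=10):
        scheduler = deque([seed_url])      # URLs waiting to be crawled later
        index, seen = {}, set()
        while scheduler and len(seen) < max_pages:
            url = scheduler.popleft()
            if url in seen:
                continue
            seen.add(url)
            try:
                page = urlopen(url, timeout=5).read().decode("utf-8", "ignore")
            except Exception:
                continue                   # skip pages that fail to download
            parser = LinkAndTextParser()
            parser.feed(page)
            # The "indexer" step: record which words appear on which page.
            for word in parser.words:
                index.setdefault(word.lower(), set()).add(url)
            # Extracted links go back into the scheduler for a later visit.
            for link in parser.links:
                scheduler.append(urljoin(url, link))
        return index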

Site owners started to recognize the value of having their sites ranked highly and visible in search engine results, creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997. The first recorded use of the term is credited to John Audette and his company, Multimedia Marketing Group, as documented by a web page from the MMG site dated August 1997; the phrase also appears on a Bruce Clay website effective March 1997.

Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag, or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using meta data to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could be an inaccurate representation of the site's actual content. Incomplete, inaccurate, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches. Web content providers also manipulated a number of attributes within the HTML source of a page in an attempt to rank well in search engines.

By relying so heavily on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt so that their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. Since the success and popularity of a search engine are determined by its ability to produce the most relevant results for any given search, poor or irrelevant results could lead users to turn to other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate.

Graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub," a search engine that relied on a mathematical algorithm to rate the importance of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links. PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the Web and follows links from one page to another. In effect, this means that some links are stronger than others, as a page with a higher PageRank is more likely to be reached by the random surfer.
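To make the random-surfer idea concrete, here is a minimal PageRank sketch in Python. The damping factor, iteration count, and toy graph are illustrative assumptions, not the parameters of Google's production algorithm; the damping factor models the surfer occasionally jumping to an arbitrary page instead of following a link.

    def pagerank(links, damping=0.85, iterations=50):
        """links[p] is the list of pages that page p links to."""
        pages = list(links)
        n = len(pages)
        rank = {p: 1.0 / n for p in pages}       # start from a uniform distribution

        for _ in range(iterations):
            new_rank = {p: (1.0 - damping) / n for p in pages}
            for page, outgoing in links.items():
                if not outgoing:                  # dangling page: spread its rank evenly
                    for p in pages:
                        new_rank[p] += damping * rank[page] / n
                else:
                    share = damping * rank[page] / len(outgoing)
                    for target in outgoing:       # each inbound link passes on a share
                        new_rank[target] += share
            rank = new_rank
        return rank

    if __name__ == "__main__":
        toy_web = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
        print(pagerank(toy_web))                  # "C" ends up with the highest rank here

In this toy graph, "C" ranks highest because it receives links from both "A" and "B", which is the sense in which more (and stronger) inbound links raise a page's PageRank.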

Page and Brin founded Google in 1998. Google attracted a loyal following among the growing number of Internet users, who liked its simple design. Off-page factors (such as PageRank and hyperlink analysis) were considered alongside on-page factors (such as keyword frequency, meta tags, headings, links, and site structure) to enable Google to avoid the kind of manipulation seen in search engines that considered only on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.

By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. Google says it ranks sites using more than 200 different signals. The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Notable SEO providers, such as Rand Fishkin, Barry Schwartz, Aaron Wall, and Jill Whalen, have studied different approaches to search engine optimization and have published their opinions in online forums and blogs. SEO practitioners may also study patents held by the various search engines to gain insight into their algorithms.

In 2005, Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for logged-in users. In 2008, Bruce Clay said that "ranking is dead" because of personalized search: it becomes meaningless to discuss how a website ranks when its position is potentially different for each user and each search.

In 2007, Google announced a campaign against paid links that transfer PageRank. On June 15, 2009, Google disclosed that it had taken measures to mitigate the effects of PageRank sculpting through use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Google Bot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting. As a result of this change, the use of nofollow leads to evaporation of PageRank. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus still permit PageRank sculpting. Additionally, several solutions have been suggested that include the use of iframes, Flash, and JavaScript.

In December 2009, Google announced it would be using the web search history of all its users in order to populate search results.

Real-time search was introduced in late 2009 in an attempt to make search results more timely and relevant. Historically, site administrators had spent months or even years optimizing a website to increase its search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.