Webmasters and content providers began optimizing sites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters needed to do was submit the address of a page, or URL, to the various engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed. The process involves a search engine spider downloading a page and storing it on the search engine's own server, where a second program, known as an indexer, extracts various information about the page, such as the words it contains and where they are located, as well as any weight for specific words, and all links the page contains, which are then placed into a scheduler for crawling at a later date.
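That crawl-extract-index-schedule loop can be sketched in a few lines. The following Python sketch is illustrative only, not any engine's actual implementation; it fetches a page, extracts links and words with the standard-library HTML parser, and queues discovered links for a later crawl. The seed URL and the in-memory index are hypothetical placeholders.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class PageParser(HTMLParser):
    """Collects outbound links and visible words from one page."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []
        self.words = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(urljoin(self.base_url, href))

    def handle_data(self, data):
        self.words.extend(data.split())

def crawl(seed_url, max_pages=10):
    index = {}                      # word -> set of URLs containing it
    scheduler = deque([seed_url])   # pages queued for a later crawl
    seen = set()
    while scheduler and len(seen) < max_pages:
        url = scheduler.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except (OSError, ValueError):
            continue                # skip unreachable or malformed URLs
        parser = PageParser(url)
        parser.feed(html)
        for word in parser.words:   # the "indexer" step
            index.setdefault(word.lower(), set()).add(url)
        scheduler.extend(parser.links)
    return index

# Hypothetical usage; "https://example.com" is a placeholder seed.
# index = crawl("https://example.com")
```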
Site owners started to recognize the value of having their sites ranked higher and more visible in search engine results, creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997; the first documented use of the term was by John Audette and his company Multimedia Marketing Group, as documented by a web page from the MMG site from August 1997.
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag, or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches. Web content providers also manipulated a number of attributes within the HTML (HyperText Markup Language) source of a page in an attempt to rank well in search engines.
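To illustrate what such webmaster-provided metadata looked like, the sketch below pulls the keywords meta tag out of a page with Python's standard-library parser. The sample HTML is invented, and deliberately shows how the declared keywords could misrepresent the page body.

```python
from html.parser import HTMLParser

class MetaKeywordsParser(HTMLParser):
    """Extracts the content of <meta name="keywords" ...>, if present."""
    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "keywords":
            self.keywords = [k.strip() for k in (a.get("content") or "").split(",")]

# Invented example page: nothing forces the keywords to match the body.
sample = """<html><head>
<meta name="keywords" content="cheap flights, hotels, casino">
</head><body>A page that is actually about something else entirely.</body></html>"""

parser = MetaKeywordsParser()
parser.feed(sample)
print(parser.keywords)  # ['cheap flights', 'hotels', 'casino']
```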
By relying so much on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. Since the success and popularity of a search engine are determined by its ability to produce the most relevant results for any given search, poor quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate.
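For concreteness, keyword density as the term is usually understood is simply the share of a page's words taken up by one term, which is why it was so easy to manipulate. A minimal calculation, with invented figures, might look like this:

```python
def keyword_density(text, keyword):
    """Fraction of words in `text` equal to `keyword` (case-insensitive)."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# Invented example: a 12-word page that repeats "flights" three times.
page = "cheap flights cheap flights book flights now best deals on air travel"
print(f"{keyword_density(page, 'flights'):.0%}")  # 25%
```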
Graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub," a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links. PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web and follows links from one page to another. In effect, this means that some links are stronger than others, as a page with a higher PageRank is more likely to be reached by the random surfer.
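The random-surfer idea corresponds to the published PageRank formula, PR(p) = (1 - d)/N + d · Σ PR(q)/outdeg(q) over pages q linking to p. The power-iteration sketch below is a textbook rendering of that formula, with an assumed damping factor of 0.85 and an invented three-page link graph, not Google's production code.

```python
def pagerank(graph, damping=0.85, iterations=50):
    """Power iteration for PR(p) = (1-d)/N + d * sum(PR(q)/outdeg(q)).

    `graph` maps each page to the list of pages it links to.
    """
    pages = list(graph)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in graph.items():
            if not outlinks:        # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Invented three-page graph: A and C both link to B, so B ranks highest.
graph = {"A": ["B"], "B": ["C"], "C": ["A", "B"]}
print(pagerank(graph))
```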
Page and Brin founded Google in 1998. Google attracted a loyal following among the growing number of Internet users, who liked its simple design. Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links, and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.
By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. In June 2007, The New York Times' Saul Hansell stated that Google ranks sites using more than 200 different signals. The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Some SEO practitioners have studied different approaches to search engine optimization and have shared their personal opinions. Patents related to search engines can provide information to better understand search engines.
In 2005, Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for logged-in users. In 2008, Bruce Clay said that "ranking is dead" because of personalized search. He opined that it would become meaningless to discuss how a website ranked, because its rank would potentially be different for each user and each search.
In 2007, Google announced a campaign against paid links that transfer PageRank. On June 15, 2009, Google disclosed that it had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting. As a result of this change, the usage of nofollow leads to evaporation of PageRank. In order to avoid the above, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the use of iframes, Flash, and JavaScript.
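For illustration, the sketch below shows how a nofollowed link differs from a plain one, and how a crawler-side check might tell them apart. The sample HTML is invented, and the onclick pattern is only one reported variant of JavaScript obfuscation, shown here as an assumption about form rather than any specific site's technique.

```python
from html.parser import HTMLParser

class LinkAuditor(HTMLParser):
    """Splits anchors into followed and nofollowed links."""
    def __init__(self):
        super().__init__()
        self.followed, self.nofollowed = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        a = dict(attrs)
        href = a.get("href")
        rels = (a.get("rel") or "").lower().split()
        if href:
            (self.nofollowed if "nofollow" in rels else self.followed).append(href)

# Invented page: one plain link, one nofollowed link, and one link hidden
# behind JavaScript, which exposes no href for a parser to follow at all.
sample = """
<a href="/about">About</a>
<a href="/sponsor" rel="nofollow">Sponsor</a>
<span onclick="window.location='/hidden'">Hidden</span>
"""

auditor = LinkAuditor()
auditor.feed(sample)
print(auditor.followed)    # ['/about']
print(auditor.nofollowed)  # ['/sponsor']
```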
In December 2009, Google announced it would be using the web search history of all its users in order to populate search results.
Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators had spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.

In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites have copied content from one another and benefited in search engine rankings by engaging in this practice; however, Google implemented a new system that punishes sites whose content is not unique.
In April 2012, Google launched the Google Penguin update, the goal of which was to penalize websites that used manipulative techniques to improve their rankings on the search engine.