Search engine optimization
Search engine optimization (SEO) is the process of improving the visibility of a website or a web page in search engines via the "natural" or un-paid ("organic" or "algorithmic") search results. In general, the earlier (or higher ranked on the search results page) and more frequently a site appears in the search results list, the more visitors it will receive from the search engine's users. SEO may target different kinds of search, including image search, local search, video search, academic search, news search, and industry-specific vertical search engines.

As an Internet marketing strategy, SEO considers how search engines work, what people search for, the actual search terms typed into search engines, and which search engines are preferred by the targeted audience. Optimizing a website may involve editing its content and HTML and associated coding, both to increase its relevance to specific keywords and to remove barriers to the indexing activities of search engine crawlers. Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic.

The acronym "SEOs" can refer to "search engine optimizers," a term adopted by an industry of consultants who carry out optimization projects on behalf of clients, and by employees who perform SEO services in-house. Search engine optimizers may offer SEO as a stand-alone service or as part of a broader marketing campaign. Because effective SEO may require changes to a site's HTML source code and content, SEO tactics may be incorporated into website development and design. The term "search engine friendly" may be used to describe website designs, menus, content management systems, images, videos, shopping carts, and other elements that have been optimized for search engine exposure.

History

Webmasters and content providers began optimizing sites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters needed to do was submit the address of a page, or URL, to the various engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed. The process involves a search engine spider downloading a page and storing it on the search engine's own server, where a second program, known as an indexer, extracts information about the page: the words it contains, where those words are located, any weight given to specific words, and all the links the page contains. Those links are then placed into a scheduler for crawling at a later date.
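The download-then-index pipeline just described can be sketched with Python's standard library. This toy indexer records word positions and queues outbound links for a later crawl; it is an illustration of the idea, not how any real engine is built, and the sample page is hypothetical:

```python
from collections import deque
from html.parser import HTMLParser

class PageIndexer(HTMLParser):
    """Extracts word positions and outbound links from one downloaded page."""
    def __init__(self):
        super().__init__()
        self.word_positions = {}   # word -> positions where it appears
        self.links = []            # hrefs to hand to the crawl scheduler
        self._pos = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        for word in data.lower().split():
            self.word_positions.setdefault(word, []).append(self._pos)
            self._pos += 1

indexer = PageIndexer()
indexer.feed('<h1>Welcome</h1> <p>Cheap flights to <a href="/paris">Paris</a></p>')
indexer.close()
schedule = deque(indexer.links)  # queued for crawling at a later date
print(indexer.word_positions["paris"], list(schedule))  # → [4] ['/paris']
```

A real indexer would also record term weights (for instance, boosting words that appear in headings), which is the "weight for specific words" the paragraph mentions.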

Site owners started to recognize the value of having their sites highly ranked and visible in search engine results, creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997. The first documented use of the term was by John Audette and his company Multimedia Marketing Group, as evidenced by a web page from the MMG site dated August 1997.

Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag, or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using meta data to index pages proved unreliable, however, because the webmaster's choice of keywords in the meta tag could be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches. Web content providers also manipulated a number of attributes within the HTML source of a page in an attempt to rank well in search engines.
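For illustration, the keyword and description meta tags are ordinary elements in a page's head; every value below is hypothetical:

```html
<head>
  <title>Handmade Leather Wallets | Example Shop</title>
  <!-- The keywords meta tag listed terms the webmaster wanted to rank for;
       it was so widely abused that modern engines largely ignore it. -->
  <meta name="keywords" content="leather wallets, handmade wallets, gifts">
  <!-- The description is often displayed as the snippet on results pages. -->
  <meta name="description" content="Handmade leather wallets, crafted to order.">
</head>
```

Nothing stops a webmaster from listing keywords unrelated to the page, which is precisely the unreliability described above.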

By relying so heavily on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. Since the success and popularity of a search engine is determined by its ability to produce the most relevant results for any given search, allowing those results to be false would drive users to other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate.
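A quick sketch shows how trivially keyword density can be computed, which is exactly why it was so easy to manipulate. The tokenization here is deliberately naive, and the sample text is invented:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Percentage of words in `text` that exactly match `keyword`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

# A keyword-stuffed page scores suspiciously high:
stuffed = "cheap flights cheap flights book cheap flights now"
print(keyword_density(stuffed, "cheap"))  # → 37.5
```

Because the metric depends only on the page's own text, a webmaster could push it arbitrarily high without making the page more relevant, and engines that trusted it were gamed.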

Graduate students at Stanford University, Larry Page and Sergey Brin, developed "BackRub," a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links. PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web and follows links from one page to another. In effect, this means that some links are stronger than others, as a page with a higher PageRank is more likely to be reached by the random surfer.
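The random-surfer model can be sketched as a simple power iteration over a link graph. This is a toy illustration of the published PageRank formula, not Google's production system; the damping factor 0.85 is the value from the original paper, and the three-page graph is hypothetical:

```python
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to.
    Assumes every page has at least one outbound link."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}
    for _ in range(iterations):
        new_rank = {}
        for page in pages:
            # With probability (1 - damping) the surfer jumps to a random page;
            # otherwise they arrive by following a link from a page pointing here.
            inbound = sum(rank[q] / len(links[q]) for q in pages if page in links[q])
            new_rank[page] = (1 - damping) / n + damping * inbound
        rank = new_rank
    return rank

# Tiny web: "home" and "about" link to each other; "blog" links only to "home".
ranks = pagerank({"home": ["about"], "about": ["home"], "blog": ["home"]})
print(max(ranks, key=ranks.get))  # → home
```

Note how "blog", which nothing links to, ends up with only the random-jump share of the rank: inbound links, not a page's own content, determine its score.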

Page and Brin founded Google in 1998. Google attracted a loyal following among the growing number of Internet users, who liked its simple design. Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links, and site structure) to enable Google to avoid the kind of manipulation seen in search engines that considered only on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link-building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.

By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. Google says it ranks sites using more than 200 different signals. The leading search engines, Google, Bing, and Yahoo!, do not disclose the algorithms they use to rank pages. SEO practitioners such as Rand Fishkin, Barry Schwartz, Aaron Wall, and Jill Whalen have studied different approaches to search engine optimization and have published their opinions in online forums and blogs. SEO practitioners may also study patents held by various search engines to gain insight into the algorithms.

In 2005, Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for logged-in users. In 2008, Bruce Clay said that "ranking is dead" because of personalized search: it would become meaningless to discuss how a website ranked, because its rank would potentially be different for each user and each search.

In 2007, Google announced a campaign against paid links that transfer PageRank. On June 15, 2009, Google disclosed that it had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting. As a result of this change, the PageRank that a nofollowed link would have passed simply evaporates instead of being redistributed. To work around this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript, thus permitting PageRank sculpting; additional suggested workarounds include the use of iframes, Flash, and JavaScript.
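The mechanism at issue is just an attribute on an anchor element; the URLs here are hypothetical:

```html
<!-- An ordinary link passes PageRank to its target. -->
<a href="https://www.example.com/partner">Our partner</a>

<!-- A nofollowed link asks engines not to count it toward the target's
     ranking. After the 2009 change, the PageRank it would have carried
     evaporates rather than flowing to the page's other links. -->
<a href="https://www.example.com/ads" rel="nofollow">Sponsored link</a>
```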
In December 2009, Google announced it would be using the web search history of all its users in order to populate search results.

Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators have spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.

Relationship with search engines

By 1997, search engines recognized that webmasters were making efforts to rank well in their search engines, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms in an effort to prevent webmasters from manipulating rankings.

Due to the high marketing value of targeted search results, there is potential for an adversarial relationship between search engines and SEO service providers. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was created to discuss and minimize the damaging effects of aggressive web content providers.

Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients. Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban. Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.

Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, chats, and seminars. Major search engines provide information and guidelines to help with site optimization. Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website, and it also provides data on Google traffic to the website. Bing Toolbox provides a way for webmasters to submit a sitemap and web feeds, and allows users to determine the crawl rate and see how many pages have been indexed by that search engine.

Getting indexed

The leading search engines, such as Google, Bing, and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search-engine-indexed pages do not need to be submitted because they are found automatically. Some search engines, notably Yahoo!, operate a paid submission service that guarantees crawling for either a set fee or a cost per click. Such programs usually guarantee inclusion in the database but do not guarantee specific ranking within the search results. Two major directories, the Yahoo! Directory and the Open Directory Project, both require manual submission and human editorial review. Google offers Google Webmaster Tools, through which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links.
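An XML Sitemap is simply a list of URLs with optional crawl hints, per the sitemaps.org protocol; the URLs below are hypothetical:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2011-06-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <!-- Reachable only through a search form, so a crawler following
         links would never find it without the sitemap. -->
    <loc>https://www.example.com/catalog?item=1234</loc>
  </url>
</urlset>
```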

Search engine crawlers may look at a number of different factors when crawling a site. Not every page is indexed by the search engines. The distance of a page from the root directory of a site may also be a factor in whether or not it gets crawled. Additionally, search engines sometimes have problems crawling sites with certain kinds of graphic content, Flash files, portable document format (PDF) files, and dynamic content.

Preventing crawling

To keep undesirable content out of the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots. When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, instructing the robot as to which pages are not to be crawled. Because a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.
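A minimal robots.txt along the lines described above; the paths are hypothetical:

```
# robots.txt, served from the root of the domain.
# Keep shopping-cart pages and internal search results out of the index
# (per the March 2007 Google warning).
User-agent: *
Disallow: /cart/
Disallow: /search
```

A single page can opt out instead with the robots meta tag in its head, for example `<meta name="robots" content="noindex, nofollow">`.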

Increasing prominence

A variety of methods can increase the prominence of a webpage within the search results. Cross linking between pages of the same website, to provide more links to the most important pages, may improve its visibility. Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic. Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page's metadata, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic. URL normalization of web pages accessible via multiple URLs, using the rel="canonical" link element or 301 redirects, can help make sure links to the different versions of the URL all count towards the page's link popularity score.
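The on-page elements mentioned above all live in the document head. A hypothetical example combining the title tag, meta description, and canonical link:

```html
<head>
  <title>Blue Widgets | Acme Widget Co.</title>
  <meta name="description" content="Hand-finished blue widgets, shipped worldwide.">
  <!-- Consolidates link popularity from duplicate URLs (tracking parameters,
       session IDs, and so on) onto this one canonical address. -->
  <link rel="canonical" href="https://www.example.com/widgets/blue">
</head>
```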

Image search optimization

Image search optimization is the process of organizing the content of a webpage to increase its relevance to a specific keyword on image search engines. Like search engine optimization, the aim is to achieve a higher organic search listing and thus increase the volume of traffic from search engines.

Image search optimization techniques can be viewed as a subset of search engine optimization techniques that focus on gaining high ranks in image search engine results.

Unlike the normal SEO process, there is not much to do for ISO: making high-quality images accessible to search engines and providing some description of those images is almost all that can be done.
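In practice, "providing some description" mostly means descriptive filenames, alt text, and nearby captions; the example below is hypothetical:

```html
<!-- The filename, alt text, and surrounding caption are the main textual
     signals an image search engine has for understanding a picture. -->
<img src="/images/red-1965-mustang-convertible.jpg"
     alt="Red 1965 Ford Mustang convertible, front three-quarter view"
     width="800" height="533">
<p>A restored 1965 Mustang convertible photographed at a local auto show.</p>
```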

White hat versus black hat

SEO techniques can be classified into two broad categories: techniques that search engines recommend as part of good design, and those techniques of which search engines do not approve. The search engines attempt to minimize the effect of the latter, among them spamdexing. Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO or black hat SEO. White hat techniques tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned, either temporarily or permanently, once the search engines discover what they are doing.

An SEO technique is considered white hat if it conforms to the search engines' guidelines and involves no deception. Because the search engine guidelines are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines, but about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the spiders, rather than attempting to divert the algorithm from its intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility, although the two are not identical.

Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines, or that involve deception. One black hat technique uses hidden text, either colored to match the background, placed in an invisible div, or positioned off screen. Another method serves a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking.
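To make the cloaking technique concrete, the following Python sketch shows how a cloaking server might branch on the User-Agent header. The crawler tokens are real user-agent substrings, but the function name and page bodies are invented for this illustration; a real implementation would sit inside a web server.

```python
# Illustrative sketch of cloaking: the server inspects the User-Agent
# header and returns different content to search engine crawlers than
# to human visitors. Serving crawlers content that humans never see
# violates search engine guidelines and can lead to penalties.

CRAWLER_TOKENS = ("Googlebot", "Bingbot", "Slurp")  # common crawler UA substrings

def select_page(user_agent: str) -> str:
    """Choose which page body to serve for a given User-Agent string."""
    if any(token in user_agent for token in CRAWLER_TOKENS):
        # Keyword-stuffed copy written only for the ranking algorithm
        return "crawler page"
    # The page that human visitors actually see
    return "human page"

print(select_page("Mozilla/5.0 (compatible; Googlebot/2.1)"))   # crawler page
print(select_page("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))  # human page
```

Search engines counter this by comparing fetches made with crawler user agents against fetches made from ordinary browser user agents or undisclosed IP ranges, which is one reason cloaked sites are eventually detected.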

Search engines may penalize sites they discover using black hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms, or by a manual site review. One infamous example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices. Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's list.

As a marketing strategy

SEO is not an appropriate strategy for every website, and other Internet marketing strategies can be more effective, depending on the site operator's goals. A successful Internet marketing campaign may also depend upon building high-quality web pages to engage and persuade, setting up analytics programs to enable site owners to measure results, and improving a site's conversion rate.

SEO may generate an adequate return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Due to this lack of guarantees and certainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors. It is considered wise business practice for website operators to liberate themselves from dependence on search engine traffic. Seomoz.org has suggested that "search marketers, in a twist of irony, receive a very small share of their traffic from search engines." Instead, their main sources of traffic are links from other websites.

International markets

Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches. In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007. As of 2006, Google had an 85–90% market share in Germany. While there were hundreds of SEO firms in the US at that time, there were only about five in Germany. As of June 2008, Google's market share in the UK was close to 90% according to Hitwise. That market share is achieved in a number of countries.

As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable markets where this is the case are China, Japan, South Korea, Russia and the Czech Republic, where respectively Baidu, Yahoo! Japan, Naver, Yandex and Seznam are market leaders.

Successful search optimization for international markets may require professional translation of web pages, registration of a domain name with a top-level domain in the target market, and web hosting that provides a local IP address. Otherwise, the fundamental elements of search optimization are essentially the same, regardless of language.

Legal precedents

On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."

In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. KinderStart's website was removed from Google's index prior to the lawsuit, and the amount of traffic to the site dropped by 70%. On March 16, 2007, the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend, and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.

See also

  • Image search optimization
  • List of search engines
  • Search engine marketing
  • Search engine optimization copywriting

External links

  • Google Webmaster Guidelines
  • Yahoo! Webmaster Guidelines
  • "The Dirty Little Secrets of Search," article in The New York Times (February 12, 2011)
  • Technical tutorial on search engine optimization, given at Google I/O 2010
The source of this article is wikipedia, the free encyclopedia.  The text of this article is licensed under the GFDL.
 