Introduction
Search engines have developed into the Internet's most popular and powerful source of information, accounting for an estimated 80% of the Internet's traffic (Heche, 2007, p. 1). As a result, website owners are realizing the power of such devices and are shifting marketing budgets into the optimization of their sites specifically for search engines. During the toddler years of search engine optimization (SEO), crafty developers took advantage of weak search engine algorithms to display their websites in top results, regardless of their sites' relevance. However, as more advanced search engine technologies emerged to close such exploits, new SEO methods were pursued (Boykin, 2007, p. 1). With the growth in search engine popularity and accuracy, and with newly emerging techniques used to target such engines, SEO has become a cut-throat competitive industry whose use is quickly being dominated by big-business corporations (Murray, 2007, p. 1). Regardless of a company's size and status, however, company webmasters with basic knowledge of HTML and blogging can establish top search engine rankings for websites that target niche markets with great efficiency by employing specific on-page and off-page SEO techniques.
A Search Engine Primer
Search engines did not become popular overnight. In fact, it took half a decade for the general public to catch on to their power. Search engines have become woven into society only because of their brilliant architecture: systems so complex in framework, yet so simple in use, that a novice can operate them. Concisely, the modern search engine is an intricate tool built to minimize the time needed to discover information by reducing irrelevant results and maximizing result accuracy based on hundreds of relevance factors. The basic functionality of a search engine includes content discovery, indexing, querying, and ranking (Fishkin, 2007, p. 4).
Content discovery is often referred to as "Web crawling". The "Web" analogy is an important concept to grasp, since it acts as a backbone for understanding the search engine discovery process and the terminology involved. The Internet is generally referred to as the World Wide Web, or just the Web for short, because its structure most resembles that of a spider web (Davis, 2005, p. 1). Each of the billions of pages of content is linked to others in some way, creating an incomprehensibly large network of connections. Consequently, search engines have called the automated programs that crawl this web "bots" or "spiders". Modern crawlers revisit indexed sites on a regular basis to look for changes or revisions; a site is normally revisited by the crawler every one to two months. It is estimated that search engines have crawled only about half of the Web's content pages, amounting to between eight and ten billion pages (Fishkin, 2007, p. 4).
Every page crawled on the Web by a search engine is placed into a gigantic database called an index, or sometimes a catalogue. The index is organized so meticulously that a query can sort through billions of pages and find relevant matches within just fractions of a second. It can sometimes take a considerable amount of time for a search engine to actually index a site after crawling it; during this time, the site will not appear in results for those searching (Fishkin, 2007, p. 4).
Content querying is the provision of an interface, or gateway, connecting the human user and the results waiting inside the search engine's database. The results vary in type from web pages to online-published word processor documents, and are returned to the user based on the criteria they indicate. The method a user employs to indicate criteria varies by search engine. Search engines normally provide a blank text input field into which the user can type terms or phrases, then press a button to send the query to the search engine for processing. Many modern search engines incorporate exclusive input syntaxes that a user can learn in order to take full advantage of the search engine's power. Natural language searches, however, allow a user to input full sentence-structured questions instead of requiring the user to learn query syntaxes (Sullivan, 2007, p. 1). An example of syntax is placing terms in quotations. Google, the most commonly used search engine today, uses quotations to specify results that exactly match all the terms in the order they are listed. Google includes ten other operators used to better define a query and home in on the target results (Google Cheat Sheet, 2007, p. 1).
Ranking is a search engine's most distinguishing process, as it determines what information is displayed to the user and how. A commonality all search engines share by nature is the organization of pages by relevancy, starting with the most relevant and ending with the least. The higher a page ranks, the higher the site's probable relevance as perceived by the engine. Every search engine uses its own unique method, called an algorithm, to determine how pages rank in relation to one another. An algorithm is a mathematical formula that takes into consideration dozens of factors with positive and negative effects on page rank. Think of it as the set of rules a judge uses to determine who wins a beauty pageant. The winner always showcases more than beauty alone, displaying deeper qualities such as reputation, talent, and even life intentions. The many factors involved in judging pageant contestants are very much like the factors used to rank a webpage (Sisson, 2006, p. 12).
Brief History of the Search Engine
The earliest breeds of search engines were not actually search engines at all, but rather massive directories of content pages manually submitted by their authors. It was not until spiders and bots came onto the scene that people began to see the power behind such tools (Wall, n.d., p. 8). Archie appeared in 1990 as the very first tool used to search pages of the Internet; it was named to resemble the word "archive" without the "v". Built by Alan Emtage, the program indexed directory listings from public FTP sites. An alternative tool emerged a year later called Gopher, which indexed solely text files instead of all computer files. Two other index systems, Veronica and Jughead, searched the Gopher index servers and provided more targeted keyword search (Wall, n.d., p. 2). By 1993, a new generation of search technology emerged from a student at MIT: automated Web crawling. Initially used for counting and measuring the size of the Web, the first web-crawling bot on the Internet was named the World Wide Web Wanderer by its creator, Matthew Gray. ALIWEB (Archie-Like Indexing of the Web) was introduced in the same year with the capability to collect page meta-data and allow page authors to submit their own content. Search engines and crawling technology were not yet seen as having any true significance for society until further university experiments were done (Wall, n.d., p. 1).
As the Internet gained popularity and began appearing as a business opportunity to investors, college students started receiving large funding opportunities. This boom in funding caused breakthrough developments such as relevancy-based indexing to occur. Corporations like AltaVista, Ask Jeeves, Lycos, Yahoo!, and Google in turn met at the search engine scene, each bringing its own innovations to the table. AltaVista offered a brand new method of searching for the end user known as natural language inquiry (Wall, n.d., p. 1). Ask Jeeves was quick to mimic this technique, but also focused on building its index from web communities. A few years later, AltaVista was bought by Yahoo! for 235 million dollars, just one of the many small steps taken toward the multi-billion dollar establishment Yahoo! is today (Olsen, 2003, p. 1). Lycos contained the largest index of any search engine of its time, with more than 60 million documents in 1996, but eventually evolved into the fifth most popular web portal in the world (Sherman, 2002, p. 1). Lycos abandoned its own search engine algorithm and in 2006 began powering its search feature through Ask, formerly Ask Jeeves (O'Reilly, 2006, p. 1).
Although Google entered the scene relatively late in 1998, it still managed to ultimately come out on top of its tough search engine competitors (Google Milestones, 2007, p. 1). Through collaboration, Larry Page and Sergey Brin nurtured their creation until receiving more than 25 million dollars in funding within a year of its initial launch (Google Milestones, 2007, p. 3). Google partnered with AOL and Yahoo! by early 2000, which also marked the release year of the renowned Google Toolbar (Google Milestones, 2007, p. 4). In 2007, Colvin of CNN reported that "Google's figure is $149 billion and rising fast, pushing the company past most of America's biggest, most successful, most respected corporations" (Colvin, 2007, p. 1). It is clear that Google has won the search engine war, making it the most valuable search engine for webmasters to optimize their websites for. Google has practically set the standard for the other search engines that have followed in the leader's footsteps. Because of this, and because of competing search engines' inherent similarities, Google-specific page ranking factors are currently the most significant for any SEO venture (Ryan, 2006, p. 1).
On-Page Search Engine Optimization
Jumping straight into SEO, it is imperative to understand that success relies heavily on the keywords chosen for the optimization venture. Because keyword terms can be found inside the content, titles, headers, and images of a webpage, these are all considered on-page objects and therefore contribute to the optimization of the page itself. Keywords can be thought of as the foundation upon which SEO is built; remove them from the equation and the structure collapses. In relation to SEO, keywords are terms used to define the purpose of a webpage in its entirety (Fishkin, 2007, p. 9).
Commonly, there is confusion between metadata keywords and content keywords. Metadata entries are code strings placed in the code heading of a page. Search engines no longer use metadata keywords for relevancy because webmasters abused them, stuffing in irrelevant keywords to attract undeserved attention; metadata descriptions are now used only as snapshots for a few rare search engine directory page entries. Because of this, metadata entries are very insignificant to SEO. In the world of keywords, content is king. When a search query is sent, the search engine tries to return the pages whose content best matches the keywords of the inquiry (Sisson, 2006, p. 8). Since so much relies on keywords, it is common practice to conduct research to find the right related keywords or keyword combinations that can be optimized for a given scenario. There are several free online tools available for keyword research, such as the tool suite found at http://tools.seobook.com/keyword-tools (Callen, 2005, p. 32).
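To make the distinction concrete, the snippet below sketches what such metadata entries look like inside a page's code heading; the hobby-shop subject, keywords, and description are invented purely for illustration.

```html
<head>
  <!-- Hypothetical page title for an invented niche hobby shop -->
  <title>Hand-Painted Miniature Trains | Example Hobby Shop</title>

  <!-- Metadata keywords: the entry search engines now ignore for relevancy -->
  <meta name="keywords" content="miniature trains, hand-painted models, model railroads">

  <!-- Metadata description: occasionally shown as a snapshot in directory entries -->
  <meta name="description" content="Hand-painted miniature trains and supplies for model railroad hobbyists.">
</head>
```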
Keywords that are too popular will actually have a negative impact on search rankings because of the overwhelming competition. Instead of seeking popular solo keywords like "insurance" or "games", it is much more effective to find a niche (Callen, 2005, p. 12). A niche homes in on the specific product, idea, or service that is meant to be displayed in search results. When optimizing for a focused target audience, the competition is easier to outrank, which in turn promises high rankings once page optimization is established. Instead of seeking a single magic keyword, it is best to seek a keyword combination or phrase that describes the niche specifically. Most people enter two- to five-word phrases into search queries, which makes multi-keyword niches a safe bet (Sisson, 2006, p. 13).
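As a sketch of the difference, compare the two hypothetical title tags below; the niche phrase mirrors the multi-word queries most searchers actually type.

```html
<!-- A solo keyword: overwhelming competition, near-certain burial in results -->
<title>Insurance</title>

<!-- A niche keyword combination: the two- to five-word phrase a searcher would type -->
<title>Classic Car Insurance Quotes for Collectors</title>
```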
Some webmasters have tried repeating their keywords excessively on their pages to boost frequency. What these webmasters may not understand is that excessive keywording is like playing with fire: get too close, and they will get burned. If a search engine notices an unusually excessive repetition of keywords, the engine will demote the site and may even ban it from its index completely. In contrast, search engines are now intelligently seeking common relationships between terms on the Web, so when keywords are used throughout a document with fluency and in good context, they can quickly benefit a site's ranking (Fishkin, 2007, p. 9).
Keywords should be strategically placed on a webpage to maximize keyword frequency without running the risk of being seen as a keyword spammer by the search engine. If a site targets more than one keyword combination, it is important not to string the keywords together in an attempt to increase keyword relevance. In page content, header code tags will emphasize keywords for users as well as search engine spiders. Placing keywords naturally in the alt tags of content-related images will also boost page relevance and return the site in image search results. Most importantly, mentioning keywords naturally in body paragraph text will increase keyword frequency. To reiterate, however, it is important not to overuse keywords in body paragraphs, since some search engines might suspect a site with that sort of 'keyword juicing' of spam (Sisson, 2006, p. 13).
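The fragment below sketches these three placements on a single page; the hobby-shop subject and file names are hypothetical.

```html
<body>
  <!-- Header tag: emphasizes the keywords for users and spiders alike -->
  <h1>Hand-Painted Miniature Trains</h1>

  <!-- Alt tag on a content-related image: boosts relevance and image search results -->
  <img src="steam-engine.jpg" alt="hand-painted miniature steam engine">

  <!-- Body text: keywords mentioned naturally, without excessive repetition -->
  <p>Each miniature train in our catalogue is hand-painted by hobbyists
     who specialize in model railroads.</p>
</body>
```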
Linking is another imperative factor in page rankings; it will be covered in greater detail under off-page techniques, but it is also a part of on-page optimization. Internal linking generates a hierarchy of relative page rank based upon which pages are linked to most. Many webmasters do not realize they are making a mistake when they chain-link content more than two levels away from the homepage, or when they mesh-link. Mesh linking occurs when every page contains a link to every other page on the site, giving every page equal importance. This means a contact or form page will rank just as high as the actual meat of the site. To solve this issue and direct the search engines' focus towards pages of importance, a hierarchical linking system should be established: not all pages are cross-linked, and the important pages are linked to by the largest number of pages on the site (Sisson, 2006, p. 37).
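A minimal sketch of such a hierarchy follows, using an invented site: the navigation block appears on every page, while the contact page is linked only from the homepage so it cannot compete with the meat of the site.

```html
<!-- Sitewide navigation: every page links to the pages of importance -->
<nav>
  <a href="/">Home</a>
  <a href="/miniature-trains/">Miniature Trains</a>
  <a href="/model-railroads/">Model Railroads</a>
</nav>

<!-- Homepage only: a minor page receives a single link instead of sitewide links -->
<a href="/contact/">Contact</a>
```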
Off-Page Search Engine Optimization
While on-page optimization provides a solid basis for a website being recognized by spiders, it is the links from other websites that determine the rank of the recognized page. Off-page search engine optimization is mostly concerned with this establishment of inbound links to the focus website. The process is known as link building, and it is by far the most strenuous aspect of SEO. A site's page rank is determined by both the quantity and the quality of its incoming links. The quality of a link is the most heavily weighted factor and is based upon the page rank of the site making the link; if the linking site is relevant to the site being optimized, that is a further positive signal (Fishkin, 2007, p. 26). Relevancy is determined by comparing keywords in website titles, the anchor text of the link, and even the IP address. The IP address, the number value that the domain name refers to, may have a less weighted effect on page ranking if it shares a common third octet (Sisson, 2006, p. 43). Sites that have very high page ranks are referred to as 'authoritative' and will almost automatically boost the page rank of any site they link to. Two forms of linking exist: two-way and one-way.
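For illustration, a single quality inbound link might look like the fragment below, placed on a relevant, high-ranking page; the anchor text carries the optimized site's keywords, and example.com stands in for the real domain.

```html
<!-- An inbound link on an authoritative, relevant page; its anchor text
     ("hand-painted miniature trains") is one of the relevancy signals compared -->
<p>Collectors should see the
  <a href="http://www.example.com/miniature-trains/">hand-painted miniature trains</a>
  reviewed on this site.</p>
```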
Two-way linking is also known as reciprocal linking because it is a mutual establishment between site owners; the method is essentially a link swap. Some webmasters carry the misconception that paying for well-known link exchange services will guarantee site visits, but this is true only to a temporary degree (Sisson, 2006, p. 54). Moreover, link exchanges are considered manipulative and have a record of getting sites removed from search engine indexes.
One-way link building can sometimes be considered a science and an art, since many of its techniques are nothing short of brilliant. One scheme often used to build massive amounts of inbound links is to produce a gadget or banner that appeals to other site owners and encourages them to take a code snippet for the gadget or personalized banner and place it on their own site. An example of this method is evident at www.nerdtests.com. This site offers a free and fun online quiz that ranks the user's nerdiness in percent relation to everyone else who took the quiz and awards an 'official title' banner code based on the outcome. These banners can be found floating all around the net in user signatures on online community forum boards and even on personal blogs, and they provide www.nerdtests.com with an endless link base (Spencer, 2007, p. 1).
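Such a banner snippet would look roughly like the sketch below; the badge image, title, and URLs are invented stand-ins, not nerdtests.com's actual code. Every site owner who pastes it contributes one more one-way inbound link.

```html
<!-- Copy-and-paste badge code handed to the quiz taker; each paste
     plants a one-way link back to the quiz site -->
<a href="http://www.example.com/nerd-quiz/">
  <img src="http://www.example.com/badges/supreme-nerd.gif"
       alt="Supreme Nerd - take the nerd quiz">
</a>
```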
The most common and reliable method of getting back-links is submitting articles to informative websites, which usually give authors an opportunity to link to their personal sites. Social bookmarking sites like Digg, Del.icio.us, StumbleUpon, and Propeller have all recently become a hit sensation among frequent Internet users. These bookmarking sites provide a portal to sites recommended by other users. If the content on a site is valuable or entertaining enough to people, social bookmarking sites may be the most effective approach to off-page optimization, since they are based on popularity and massive viral tendencies (Hagen, 2007, p. 5).
Method
As my primary research, I conducted an interview on October 25, 2007, consisting of ten focused questions about SEO with Bill Slawski. Bill is the President of SEO by the Sea and the Director of Internet Marketing for KeyRelevance Inc., and was referred to me directly by Rand Fishkin, one of the world's most renowned and authoritative SEO experts. Bill is one of the founders and administrators of Cre8asite Forums, is an active correspondent for Search Engine Land, and writes a weekly column for their small business section. Mr. Slawski's professional credentials substantiate the validity of his interview responses and provide access to exclusive insider industry knowledge. The interview was completed via electronic mail, which Bill took full advantage of to respond with in-depth, intuitive answers complete with real-world examples.
Results
As the first question of my interview, I asked Bill how he would define SEO to the average Internet user. Bill responded, "In simplest terms, Search Engine Optimization (SEO) is applying knowledge of how search engines work to make sites easier to find on the web for the audiences that those pages were intended to attract. In more complicated terms, SEO is a matter of combining an application of marketing ideas and a knowledge of search engines to help bring the right people to a site so that they will change from visitor to consumer." This description spells out the fundamental concepts of SEO. It is important to understand that SEO is, in essence, limitless in its ways of targeting consumer markets. While advertising schemes might be limited to the specific targets the advertising company provides, based upon the small amount of information shared with them about your product, SEO delves into your niche and allows for much more targeting flexibility.
My second question for Bill was, "In what ways is SEO more effective or efficient than other online marketing methods?" He responded, "Search engine optimization means being aware of how search engines might collect information from the pages of a website, and making it easy for the search engines to index the content of those pages. In effect, it means enabling a search engine to become an index for the pages of a site. It can be less expensive than using the paid contextual ads that you see displayed with search results at a search engine, or the banner ads that show up on other websites that may point to the site advertised." This cost efficiency is an important part of why business professionals are moving away from conventional pay-per-click advertising as their primary marketing strategy and switching to search engine optimization.
The third question pointed towards the public view of this expert's own industry. My question was, "Do you think the power of SEO is relatively undermined, or in contrast, do you think it is overplayed as an online marketing method in the industry?" He responded with, "Search is one of the commonest activities that people get involved in when they go online, so making a website easy to be found in a search engine for people who might be looking for what that site has to offer is a good idea. It can make sense to include SEO as one part of a multiple part marketing effort, and to build a strong marketing plan that includes both online and offline parts. Unfortunately, there are differing skill sets amongst people who offer SEO services - some are just better than others." While my question was meant to explore only the public perception of SEO, Bill brought up an extremely important reiteration of SEO's unknown power when combined with an ultimate marketing plan encompassing offline target markets as well as those that exist online. This also adds to the concept of 'limitless' SEO possibilities.
For my next question, I asked, "Can any website benefit from SEO?" Bill explained, "SEO is only really important to sites that want to increase their visibility on the Web. A game clan site, where everyone who needs to know the address of the site already does, has no need for SEO. But, if you hope to attract visitors to your pages, it doesn't hurt to make them as search friendly as possible. And if you want to attract people to those pages who might be interested in the content of the pages, it doesn't hurt to try to use words on the pages that those people might try to search with on a search engine, and to do it in a manner that makes it more likely that those words will be found earlier on in search results." His answer suggests a widely conventional use for SEO: unless a site is meant specifically to be concealed from Internet users, any website seeking visitors can benefit greatly from even a modest amount of SEO. Since amateur implementation of SEO is a key part of this paper, the next question was worded very deliberately.
I asked, "Is it possible for webmasters to (with fundamental knowledge of HTML and blogging) implement SEO for themselves with relatively successful results for their small websites?" Unsurprisingly, my prediction was reinforced with his answer. Bill said, "Webmasters with a fundamental knowledge of HTML and blogging can achieve some success with being found on the web, but having a good knowledge of how search engines work can help a webmaster make better choices about how their site is set up for success with search engines." I asked, "In general, what is the timeline of results returned by SEO?" Bill responded saying, "The amount of time that it may take to achieve results may vary by the site involved and how much work it might need, the competitiveness of the market it is within, and the demand for what the site offers. It's almost impossible to guarantee success generally, and perhaps even harder to do it within a specified timeline." While other sources have noted resu lts can be seen in a matter of days in some cases, it seems there is no definitive amount of time that promises results to become evident. In that, the SEO marketing solution may not always suite for website owners seeking instant Web traffic.
The next question focused on SEO as a long-term asset. I asked Bill, "Do you believe SEO may become obsolete in the future?" He explained, "I don't see it becoming obsolete as much as I see it evolving. What we considered SEO in 1998 is different than what we consider it to be now. If you look at a set of search results in Google today, you may see videos, images, news, web pages, product searches, and other results that you wouldn't have seen even a couple of years ago. The web is changing and search engines are changing, and helping people so that they understand some of these changes and how they might impact their web sites will probably continue to be a need to be filled in the future." This provides a fairly straightforward answer indicating that SEO will only continue to evolve over time, rendering it a very reasonable long-term asset for any website.
My final and most important question asked, "In what ways might SEO be viable for businesses with niches?" Bill responded, "Finding a niche where you can be competitive, and where there's a demand from consumers can increase your likelihood of success. A small business can often take advantage of working within a niche that a larger business might find to be too much work for too little return. If the smaller business has considerably less overhead in terms of cost and time, they may be able to thrive in one of those niches. By focusing upon a specific market or audience that others aren't, it may be possible to be found easier if people want to find the service or goods or information that you provide within that niche." This is a fitting echo of how specific keyword combinations and niches interact: focusing on smaller markets can provide a better means of success on a smaller, yet more attainable, scale.
Discussion
Throughout my research, SEO was found to be one of, if not the, most effective Internet marketing strategies available today. Statistics have shown that the majority of online users discover information and merchandise through the use of search engines. SEO channels that majority of Internet traffic directly into a marketable solution, idea, or product with the best cost and time efficiency. By employing on-page and off-page techniques, a webmaster with basic knowledge of HTML and blogging can supply a particular niche website with a top search engine result ranking for its niche search keywords. Keywords play an imperative role in the SEO venture by providing the base of the optimized structure. The keyword focus of a pre-optimized website is determined through intense research, by identifying competition and analyzing keyword query frequencies with particular keyword research tools. After keywords are determined, on-page content structure and coding is the next priority, seeing as off-page link building logically requires a quality page to link to beforehand. Off-page techniques then utilize link building strategies to launch the rankings already established by on-page SEO past the competition.
The product resulting from my research will enable any adventurous amateur with fundamental HTML and blogging familiarity to pursue SEO with a strong likelihood of success. My product, in the form of a website, guides the pursuer with simple and concise instructions. The website splits the SEO mission between the on-page and off-page techniques explained in earlier sections of this paper. Instead of discussing these techniques in non-applicable generality, however, the website demonstrates specific examples of each optimization practice through its own optimized features.