Search Engine Optimization ABCs and FAQs

META-TAGS
META-TAGS are the part of your website’s HTML code that allows a search engine to easily identify your site’s content. In the infancy of search engine optimization, meta-tags were the primary vehicle marketers and coders used for attracting search engine traffic. Today, search engines examine more than meta-tags; they also take into account site content, internal link structure, and link popularity. Nonetheless, meta-tags remain a cornerstone of a successful search engine marketing program. Three primary meta-tags help search engines determine the relevance of a website for a specific search phrase: the title tag, the description tag, and the keyword tag.
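
As a rough sketch, here is how those three tags might appear together in a page’s head section (the site name, description, and keywords are made-up placeholders, not a recommendation for your site):

<head>
  <!-- Title tag: describes the page and carries its main keywords -->
  <title>Handmade Oak Furniture | Example Woodworks</title>
  <!-- Description tag: the short summary a search engine may show in its results -->
  <meta name="description" content="Custom handmade oak tables, chairs, and bookcases built to order.">
  <!-- Keyword tag: a short list of phrases the page hopes to be found for -->
  <meta name="keywords" content="handmade oak furniture, custom oak tables, oak bookcases">
</head>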

The TITLE TAG.
First, there is the Title Tag. Your Title Tag must describe exactly what your webpage contains. It has to contain the keywords, the descriptive words for what your website offers or does, for which you hope to be found. The Title Tag is the first thing your user sees, so it must be clear and simple to read. Keep the Title Tag to 70 characters or less. Make sure that it employs words and phrases that describe the content of the page as accurately as possible. Make sure each and every Title Tag is unique and relevant, and that every word in the tag appears somewhere on that tag’s page.
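
For example, a hypothetical title tag for the furniture page sketched above might read as follows; it is well under the 70-character limit and every word in it is reflected in the page’s content:

<title>Handmade Oak Furniture and Custom Tables | Example Woodworks</title>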

Designing Search-Friendly Web Pages:

There are two important points to remember about search engines:

1. The first is that the quality of your site counts. Search engines make their money through advertising. Showing ads to their users is their profit model, and the more users they have, the more money they make. The way a search engine gets more users is by giving them the best search results. This means that if your site is the most useful site to customers in your category, then search engines want to rank you highly. Indeed, their livelihood depends on it.

2. The second thing to remember is that search engines are computer programs. More precisely, search engines run a program called a spider that downloads your web pages, reads the text and links on those pages, and decides what to do with your pages based on that information. This process is known as crawling.

Employ a Flat Directory Structure
In general, the flatter your site’s directory structure, the better your chances of getting more of your pages spidered. In other words, pages that are several sub-directory levels deep will often get spidered less frequently. For example, consider the depth of the web page at the following URL:

http://www.yoursite.com/content/articles/2005/05/pages.htm

Closer inspection reveals that the file pages.htm sits four directory levels deep.

A page that deep should generally not expect to be easily indexed or to rank highly in any search engine. Exceptions may be made for popular sites with lots of incoming links, but any web page buried that deep puts itself at a ranking disadvantage.

It is not uncommon to use directories to organize the structure of a site. That is because a flat directory structure, one that places all the pages only two or three levels deep, is very difficult to organize logically. If the structure of your site dictates that you must use deep sub-directories, then it becomes even more important to include a strategically placed site map.
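
As a rough illustration (the flatter path is hypothetical), the same article could be served from a much shallower location without changing its content:

http://www.yoursite.com/content/articles/2005/05/pages.htm   (four directory levels deep)
http://www.yoursite.com/articles/pages.htm   (one directory level deep)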

Use a Site Map
A site map is a web page that links to every other page within your site. By creating a good site map and linking to it from your homepage, you ensure that each page on your site is only one click away from your site map and only two clicks away from your homepage. This is the optimum structure in terms of making web pages easy for the search engine spider to find.

As you know, search engine spiders find new pages by following links from the pages that are already in their index. Thus, if you want a spider to crawl a web page, you have to make sure that it is linked to from a page the search engine already knows about.

However, unless you have a very small site, linking to all your pages from your homepage would look messy and unprofessional to your customers. The site map lets you get all of your pages indexed without cluttering your homepage with links. By placing a link on your homepage to the site map, and then a link from the site map to all of the rest of your important pages, you make all your site’s pages easy to find and index.
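
A minimal sketch of what such a site map page might look like (the file and page names are placeholders):

<!-- sitemap.htm: linked from the homepage, and linking to every important page -->
<html>
<head><title>Site Map | Example Woodworks</title></head>
<body>
  <h1>Site Map</h1>
  <ul>
    <li><a href="/index.htm">Home</a></li>
    <li><a href="/articles/oak-tables.htm">Oak Tables</a></li>
    <li><a href="/articles/oak-bookcases.htm">Oak Bookcases</a></li>
    <li><a href="/contact.htm">Contact Us</a></li>
  </ul>
</body>
</html>

On the homepage itself, a single link such as <a href="/sitemap.htm">Site Map</a> is enough to put every listed page two clicks from the homepage.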

The search engines themselves will tell you to use a site map to ensure that all your web pages get indexed. In fact, Google has its own site map program. Our SEO Tune Up covers the basics of good Search Engine Optimization practices.
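
Google’s program accepts an XML file rather than an HTML page; a minimal sketch in the standard sitemap protocol (sitemaps.org) looks like this, with a made-up URL and date:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yoursite.com/articles/pages.htm</loc>
    <lastmod>2005-05-01</lastmod>
  </url>
</urlset>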

About Victoria Stankard

Victoria Stankard has been an online SEO content writer for a variety of markets across the nation since 2006. Specializing in optimized content marketing strategies and the owner of a successful organic search engine optimization company, Victoria writes for real people with "The Optimized Edge" - putting you on the map and more!

Comments

  1. Bazi Drink Dean says:

    I like your advice about always using a site map. It does make it easier to get everything indexed. Make sure to validate your code too. There are plenty of code validation utilities available for free online. Code validation is an extra step, but you don’t want to confuse those spiders, now do you?
