Search engine optimization is on every webmaster’s mind these days. A good rank for the right keywords can mean a steady stream of targeted traffic to your website, and that traffic is free. The key to getting a high search engine ranking is correctly structuring your website, which includes having plenty of content that’s relevant to your keywords and making your website spider friendly. This section is a checklist to make sure all your web pages can be found, indexed, and ranked correctly.
A domain name is key to getting a high rank in search engines. Try to use two or three keywords in your domain name; this shows search engines that your domain is relevant to what your website is about.
Make sure your site deals with an identifiable theme that is obvious from the text on the home page and is reinforced by every page on your website. All the individual web pages should relate to each other and deal with various aspects of one central theme. The text on your home page should clearly state what your website is about, the other pages should reinforce that, and the theme should also be identifiable from your domain name.
When the spiders come to your website, they’re looking for content. If a page doesn’t have much content, or the content doesn’t appear closely related to the page’s title and your website’s theme, the page probably won’t be indexed, and if it does get indexed it won’t rank well. Search engines love quality content and lots of it. Content is what Internet users are looking for, and search engines try to provide it. So it all comes back to a theme.
Don’t bury your important pages too deep in your directories, where it takes several clicks to reach them from your home page. Search engines index your home page first (most of the time), then over time start to index the other pages on your website. Many spiders are programmed to go only three layers deep - if some of your important content is buried deeper than that, it may never be found or indexed at all.
The title is one of the most important aspects of a webpage when it comes to search engines. Don’t use a generic title for every webpage you have; use the keywords you’re targeting for that page, keeping the title brief but descriptive. If you picked a domain name with the right keywords, this will be a big bonus for you. Also try to use a keyword in your 2nd-level and 3rd-level directory names.
A Domain Name Example - 2nd & 3rd Level Directory Names
TOP LEVEL DOMAIN - domainname.com
2nd level directory - domainname.com/realestate.html
3rd level directory - domainname.com/losangeles/realestate.html
As you can see, there are ways to include more keywords. Every page should use a keyword in its 2nd- and 3rd-level directory and file names.
The description Meta tag contains a highly descriptive sentence about the content and purpose of your page, and should include your most important keyword phrase early in the sentence. Remember, not every search engine will display it, but many will. The art of getting it right is to use keywords relevant to your page and write it from a third-person point of view, with no keyword stuffing - a stuffed description just looks bad to users (real people). You need it to look good for search engines, but also for people, so find a middle ground. Don't forget to write one for every webpage.
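As a sketch, a description Meta tag for the hypothetical Los Angeles real estate page used in the directory example above might look like this (the page wording is made up for illustration - note the keyword phrase appears early, written in the third person, with no stuffing):

```html
<head>
  <title>Los Angeles Real Estate Listings - DomainName.com</title>
  <meta name="description"
        content="Los Angeles real estate listings with photos, prices and
                 neighborhood guides for home buyers and sellers.">
</head>
```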
As with the Meta tag description, not every search engine will use the keywords Meta tag. But some will use it, and none will penalize you for having it. Also, having a short list of the keywords you're targeting will help you write appropriate content for each page. The keyword tag should contain your targeted keyword phrase plus common variations, common misspellings, and related terms. Make sure your keywords relate closely to the page content and tie into the overall theme of your site. This will really help with your Yahoo listing; Google tends to ignore the tag altogether.
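Continuing the same hypothetical real estate example, a keywords Meta tag with the targeted phrase, a common variation, and a common misspelling might look like:

```html
<meta name="keywords"
      content="los angeles real estate, la real estate,
               los angeles realestate, los angeles homes for sale">
```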
You’ll have to achieve balance. You’ll need to include keyword phrases (and variations) a number of times within your text, but not so many times that you appear guilty of keyword stuffing. The trick is to work the keywords into the text so that it reads as naturally as possible for your website visitors. Remember, it’s possible to incorporate keywords into any webpage element that is potentially viewable by your website visitors. This can include header text, link text, titles, table captions, the "alt" attribute of the image tag, the "title" attribute of the link tag, etc.
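For example (page names and wording hypothetical), the same keyword phrase can appear in several visitor-viewable elements without stuffing:

```html
<!-- Header text -->
<h1>Los Angeles Real Estate</h1>

<!-- The "alt" attribute of an image -->
<img src="downtown.jpg" alt="Los Angeles real estate - downtown listings">

<!-- Link text plus the "title" attribute of the link tag -->
<a href="losangeles/realestate.html"
   title="Browse Los Angeles real estate listings">
   Los Angeles real estate listings</a>
```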
This is critical: if your pages can't be found, they can't be indexed and included in search results, let alone rank well. Search engines use spiders to explore your website and index your pages, so every page must be accessible by following text links. If pages require a password to view, are generated by a script in response to a query, or have a long and complicated URL, spiders may not be able to read them. You’ll need to have simple text links to the pages you want indexed.
It's a good idea to create a sitemap. Check out this section to learn how to create your own very easily.
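A basic HTML sitemap is just a page of plain text links to every page you want indexed, which gives spiders a simple path to all of them. A minimal sketch (file names hypothetical):

```html
<!-- sitemap.html -->
<h1>Site Map</h1>
<ul>
  <li><a href="index.html">Home</a></li>
  <li><a href="realestate.html">Real Estate Overview</a></li>
  <li><a href="losangeles/realestate.html">Los Angeles Real Estate</a></li>
</ul>
```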
Internal links help determine PageRank, since they show which pages of your site are the most important. The more links you have to a page relative to other pages on your site, the more important search engines will consider that page.
When you create a text link to another page, make sure to use targeted keywords as the text for the link (inside the anchor tags that create the link). Make it as descriptive as possible.
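For instance, compare a vague link with a keyword-rich one (page name hypothetical):

```html
<!-- Weak: tells spiders nothing about the target page -->
<a href="losangeles/realestate.html">click here</a>

<!-- Better: descriptive, keyword-rich anchor text -->
<a href="losangeles/realestate.html">Los Angeles real estate listings</a>
```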
If possible, don't use frames on any page you want to get indexed by search engines. If you feel you simply must use frames for a page, then also use the "noframes" HTML tag to provide alternative text that spiders can read (and make that text descriptive rather than just a notice that "This site uses frames").
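A sketch of the noframes fallback (frame file names hypothetical) - the text inside the noframes tag should describe the site the way a normal page would, with real links:

```html
<frameset cols="25%,75%">
  <frame src="menu.html">
  <frame src="content.html">
  <noframes>
    <body>
      <h1>Los Angeles Real Estate</h1>
      <p>Browse our <a href="losangeles/realestate.html">
         Los Angeles real estate listings</a>, updated daily.</p>
    </body>
  </noframes>
</frameset>
```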
Don’t make any pages that automatically redirect the visitor to another page. The one exception is a page you’ve deleted for good, where you should use a 301 redirect. This is a permanent redirect, which is acceptable to search engines.
Spiders can’t read content in JPEG, GIF, or PNG files. If you really feel that using an image rather than text is critical, make sure to put some text in the image’s "alt" attribute (or in the "title" attribute if you're using the image as a hyperlink).
Important Content Should Not Be Contained In Flash Files
Flash is a wonderful technology, but search engines are only now starting to be able to read Flash files. It's better not to place your important content inside a .swf file. When using Flash you need to strike a balance: use it as a visual aid, not as the container for your content.
Don't stop at your home page. Take the time to optimize any page that has a reasonable chance of being indexed by the major search engines, targeting appropriate keywords for each webpage. If you face a lot of competition it may be nearly impossible to get a top ranking for your home page, but you can still get lots of search engine traffic to your site from other pages that focus on very specific keyword phrases.
Your site should have unique content on every page that distinguishes it from every other page on your site. Duplicating content, or having pages that are only slightly different, might be seen as search engine spamming (trying to manipulate search engine results).
Somewhere on your site you should state your policies about other people linking to your website and provide the wording you’d like them to use in their link. You want to encourage other people to link to your site, preferably using text and a description that reflect the keywords for that page. To make it easy on them, provide the ready-made HTML code for the link – not everyone will use it, but many will as a courtesy, as long as it doesn’t contain marketing hype.
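For example, the ready-made link code you offer could be as simple as this snippet (URL and wording hypothetical), which visitors can copy and paste onto their own pages:

```html
<a href="http://www.domainname.com/"
   title="Los Angeles real estate listings">
   Los Angeles Real Estate - homes for sale in Los Angeles</a>
```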
Text links are better from an SEO standpoint than image links, as spiders can't read text from an image file. If you feel you really must use a graphic as a link, at least include a text description (with the relevant keywords) using the "title" attribute of the link tag.
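If a graphic link is unavoidable, a sketch like this (file name hypothetical) keeps a keyword-rich text description on both the link and the image:

```html
<a href="losangeles/realestate.html"
   title="Los Angeles real estate listings">
   <img src="listings-button.gif" alt="Los Angeles real estate"></a>
```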
HTML coding errors and non-working links can keep search engine spiders from correctly reading and indexing your pages. It's a good idea to use a webpage validation utility to check your HTML code; that way you’ll know for sure it’s error-free.