
Optimize Website Structure for Search-Engines

In today's busy life, every developer looks for a quick and easy method to use as a standard for his or her own websites. Although working on website structure might look a little time-consuming, being more attentive and dedicating more time to it can make a big difference for both search-engines and the users searching on the net. Before starting, it is also necessary to dedicate some time to selecting the website's hosting and domain name. When selecting hosting for a website, one must make sure to choose a reliable hosting company with positive feedback from past clients, which altogether gives credibility in the eyes of search-engines. When selecting the domain name, on the other hand, it is important to identify the main keyword that represents the website. The main keyword(s) may also be the company name, but preferably only if the company name is the first term users would search for to reach the website. An example would be Toyota's website, where users search for the term "toyota" directly instead of the generic term "car".

Creating the Website Structure


As explained in the previous article, Keywords and On-Site Factors for Search-Engine Optimization, it is suggested to create the website's navigation system with hyper-links instead of buttons, Flash or JavaScript. Relative URLs are not recognized as back-links by search-engine spiders, therefore it is best to use absolute URLs to create as many back-links as possible, even if they come from internal pages (the same domain). An absolute URL contains the full domain, for example "http://www.example.com/viewProducts.html" (example.com is used here as a placeholder), while a relative URL looks like "../viewProducts.html". A descriptive anchor text gives search-engine crawlers an idea of the linked page's content, so generic terms such as "Click Here" or "Continue Reading" should be avoided; descriptive call-to-action phrases such as "More about Cats and Dogs" are suggested instead.

When creating links throughout the website, it is important to build a hierarchy between them. The main navigation menu sits at the top of the hierarchy and should not include sub-categories; main pages should link to the main categories, while individual sub-category pages should link to related pages. A page should never link to irrelevant content, since search-engines may then have difficulty determining the theme or topic of the page, which can eventually affect that page's ranking. In cases where such a link is essential on the page, the nofollow attribute can be used to avoid this confusion. Nofollow is an HTML attribute on the anchor tag which tells search-engines not to follow or index that link. It also stops the page from passing PageRank to the linked page, since PageRank normally flows through every followed link; this keeps PageRank from being spent on pages that are irrelevant to the website's service, such as the Contact Us page or the About page. To use it, one must simply add rel="nofollow" to the normal hyper-link tag. Example: <a href="http://www.example.com/cats" rel="nofollow">Cats</a>.
Google introduced the nofollow attribute a few years back and the other search-engines accepted it, but they may not always honor it.


Creating a sitemap helps search-engines better understand the content, layout and hierarchy of the website's pages. It also enables search-engines to quickly find new or recently updated pages. Sitemaps may be easily created with an online sitemap generator or through Microsoft Visual Studio, which offers a tool for this (Create / New / Sitemap).
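For illustration, a minimal XML sitemap following the sitemaps.org protocol is sketched below; the domain, page names and dates are placeholders, not values from a real website:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap sketch; example.com and all values are placeholders -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/cats.html</loc>
    <priority>0.8</priority>
  </url>
</urlset>
```

The file is usually saved as sitemap.xml in the website's root directory so that crawlers can find it at a predictable path.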

Make a Legitimate and Trustworthy Website

A lot of spammers automatically create numerous simple websites which do not contain any pages or content representing a real company, since no such company exists. In order to help search-engines believe that a website is real, legitimate and trustworthy, an About the Company page and a Privacy Policy page are required. The About the Company (or Information) page should contain a company biography and history, photos of the people who make up the company, and where it is situated. Meanwhile, the Privacy Policy page should contain information that concerns the website's users; for example, it may inform users that collected e-mail addresses will be used for statistical purposes only and will not be passed on to any outside website. If a Privacy Policy page seems hard to write, online policy generators can help in building one.

Robots.txt File

Robots.txt is a simple text file which tells search-engine spider crawlers which pages they may follow or index. If this file is not created, it is assumed that all pages are to be followed and indexed. Even if this may not seem to be a problem, it is: following and indexing all pages may lead to a potentially unexpected and undesired increase in server bandwidth, especially if large images are used on the crawled pages. The Robots.txt file may also make it easier for search-engines to find the sitemap, as it can specify the exact path with a line of the form "Sitemap: http://www.example.com/sitemap.xml" (with the placeholder replaced by the sitemap's real URL).
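As a sketch, a Robots.txt that keeps crawlers out of two bandwidth-heavy directories and points them to the sitemap might look like this; the directory names and domain are assumed placeholders:

```text
# Applies to all crawlers
User-agent: *
# Keep crawlers out of placeholder directories with large files
Disallow: /images/large/
Disallow: /admin/

# Exact path to the sitemap (placeholder domain)
Sitemap: http://www.example.com/sitemap.xml
```

The file must be named robots.txt and placed in the website's root directory, as crawlers only look for it there.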

.htaccess File

The .htaccess file is a configuration file for Apache web servers, used mainly to rewrite URLs and redirect traffic. It also offers various forms of protection, such as banning specific IP addresses and creating password-protected directories. A .htaccess file may be created using a simple text editor, and the command RewriteEngine On at the top of the file activates mod_rewrite. Some useful features:
Blocking Bad Bots
  • To match requests which retrieve content from the website through the Linux Wget command, the following line is added (the trailing [OR] allows further user agents to be chained on the lines below it):
    • RewriteCond %{HTTP_USER_AGENT} ^Wget [OR]
  • To forbid those requests access to the website, the following line is added (note the space-separated dash, which means "do not rewrite"):
    • RewriteRule ^.* - [F,L]
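Put together, a minimal bad-bot block might look like the following sketch. Wget comes from the text above; HTTrack is an assumed additional mirroring tool, included only to show how [OR] chains several user agents:

```apache
RewriteEngine On
# Match the Wget download tool; [OR] chains it with the next condition
RewriteCond %{HTTP_USER_AGENT} ^Wget [OR]
# HTTrack is an illustrative addition, not from the article
RewriteCond %{HTTP_USER_AGENT} ^HTTrack
# Forbid the request (403) and stop processing further rules
RewriteRule ^.* - [F,L]
```

Each extra bot gets its own RewriteCond line; only the last condition in the chain omits the [OR] flag.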
Prevent Image Theft
  • Many sites tend to steal images by linking directly to the image files on your website. Even if the intentions may be harmless, this increases your server's bandwidth usage. In order to prevent other sites from displaying images on their pages by linking directly to your site, the following lines are added (the two RewriteCond lines allow requests with an empty referer or a referer from your own domain, here the placeholder domain.com, so that only external hotlinking is blocked):
    • RewriteCond %{HTTP_REFERER} !^$
    • RewriteCond %{HTTP_REFERER} !^http://(www\.)?domain\.com/ [NC]
    • RewriteRule \.(jpg|jpeg|png|gif)$ - [NC,F,L]
  • mod_rewrite is used in the .htaccess file to map URIs from one form to another using regular expressions. If a URI matches a pattern specified in the .htaccess file, mod_rewrite rewrites it according to the conditions set.
  • Even after the website has been created, mod_rewrite makes it possible to rewrite URIs or URLs in a more search-engine and user friendly way.
    • Rewriting Static URLs:
      • RewriteRule ^before\.html$ after.html
    • Rewriting Dynamic URLs:
      • RewriteRule ^([^/]*)\.html$ ViewProducts.aspx?cat=$1 [L]
    • Example (using example.com as a placeholder domain):
      • Before: http://www.example.com/ViewProducts.aspx?cat=cats
      • After: http://www.example.com/cats.html
Non-WWW and WWW links
  • Redirecting between the non-WWW and WWW versions of a website is important in order to avoid the non-WWW/WWW canonical issue: when some back-links to a domain contain the WWW and others do not, even though they lead to the same website, search-engine spiders follow the separate URLs as if they led to two different websites, and the ranking value of the back-links is spread over two different entities.
  • In order to fix this canonical issue, one must set up a 301 redirect (also known as a permanent redirect) in the main root directory as follows:
    • Open a new text file (or the current .htaccess file), initialize mod_rewrite by typing "RewriteEngine On" on the first line, and save the file as "htaccess.txt" for now
    • Redirecting non-WWW to WWW (domain.com is a placeholder for the real domain):
      • RewriteCond %{HTTP_HOST} !^www\.domain\.com$
      • RewriteRule (.*) http://www.domain.com/$1 [R=301,L]
    • Redirecting WWW to non-WWW:
      • RewriteCond %{HTTP_HOST} !^domain\.com$
      • RewriteRule (.*) http://domain.com/$1 [R=301,L]
    • Redirecting to a NEW DOMAIN (newdomain.com is a placeholder):
      • RewriteRule (.*) http://www.newdomain.com/$1 [R=301,L]
    • When completed, upload the file to the main directory and rename it from "htaccess.txt" to ".htaccess"
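Collected into one file, the non-WWW to WWW redirect described in the steps above would read as the following sketch, with domain.com again standing in for the real domain:

```apache
RewriteEngine On
# If the requested host is anything other than www.domain.com ...
RewriteCond %{HTTP_HOST} !^www\.domain\.com$
# ... permanently (301) redirect to the www version, keeping the requested path
RewriteRule (.*) http://www.domain.com/$1 [R=301,L]
```

The [R=301] flag issues the permanent redirect that consolidates the back-link value, and [L] stops mod_rewrite from applying any further rules to the request.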
For more .htaccess tools visit
