Another important area you should investigate early in your SEO campaign is making sure that your site design is search-engine friendly.

Keeping The Web Spiders Happy

Search engines use software robots called spiders to crawl the web, finding and recording web pages to add to the search engine's index.

Almost everyone is aware of the need to have an attractive and informative site to attract and keep the attention of their visitors, but many of us fail to also consider our site from the viewpoint of the busy little spiders who have to crawl the pages and report the results to the search engines.

Being software robots, spiders take no notice of your fancy Flash intros, stunning graphics, superb sounds, and clever JavaScript applications. Instead, they have to feel their way blindly around your site, using only the invisible tools you have thoughtfully provided for them. The easier you make this task, and the more closely you follow their specifications, the more easily they can do their job: at the very least they may spider your site more often, and in extreme cases it can even affect your site rankings.

You should be aware that some spiders may be unable to read Flash-enabled pages, image maps, dynamic pages with too many URL parameters, password-protected pages, and .pdf files. Many search engines, with Google leading the pack, can now spider and rank some of these problem cases, but others cannot.

Flash and Spiders

At present, Google is not only able to read but also to index and rank pages based on Flash (.swf files), while other search engines can read, but do not appear to index or rank based on, the text content of Flash pages. Using the <noembed> tag will help your Flash pages to be indexed and ranked by all the major search engines. Note, however, that Google will be deprecating Flash by the end of December 2020, so it is time to remove it from your websites.
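As a rough sketch of the technique, the (now legacy) <noembed> tag sits alongside the embedded movie and holds plain text that spiders can read and index. The file name, dimensions, and fallback copy below are placeholders, not taken from any real site:

```html
<!-- intro.swf, the dimensions, and the fallback text are illustrative placeholders -->
<embed src="intro.swf" type="application/x-shockwave-flash" width="600" height="400">
<noembed>
  <!-- Crawlable fallback: this is the text search engines can index -->
  <p>Acme Widgets: hand-crafted widgets, shipped worldwide since 1999.</p>
</noembed>
```

Browsers that play the Flash movie ignore the fallback, while text-only agents (including spiders that cannot parse .swf content) see the paragraph instead.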

Next, let us take a look at designing pages for spiders.

SEO and Web Design – History

In the earlier days of search engine optimization, good search engine rankings were obtained mostly through the use of titles and meta tags. But as the search engines responded to misuse of these simple methods, they have become much more focused on on-page content and other, less easily controlled, factors. Page content is, after all, the only thing your viewers see, and thus it remains important, perhaps more so to viewers than to search engines.

The days when webmasters could achieve high search engine rankings by inserting keywords in very tiny text, using text the same colour as the background, or repeating the same keywords over and over (keyword stuffing) are long dead and gone. The search engines have responded to this trickery and may now penalize your site if they become aware that you are using very small or invisible text, cloaking your pages, or employing any of the other tricks designed to fool them.

Note that I said if they become aware of it: I am of the opinion that such tricks are not always detected algorithmically, but search engines often become aware of them through spam reports, or by periodically running searches designed to detect such spam.

They are also able to conclude that if a page of 1,000 words contains the same word 200 times, it is likely to be spam, and they act accordingly.

Web Page Design Guidelines

You want to enrich your pages with keyword-filled content, and to put your most important information as high up on the page as you can. But you need to present your content in a balanced and ethical manner; attempts to do otherwise can, and will, be held against you.

Remember too that your page content will be seen by your viewers and should be attractive, informative, and written to sell your product or service. You have to balance the content of your pages to serve two demanding masters: the search engines and your customers.

With that in mind, let’s take a look at some techniques that you can ethically use to optimize your pages:

  • Always opt for heading tags (h1–h6) rather than merely a larger font when the text contains a keyword. Heading content is considered more descriptive of what is actually on the page.
  • Place your most keyword-rich content as high on the page as you can. As a general rule, the first 25 words of body text are heavily weighted. Don't forget that spiders crawl all the HTML on your page and thus may go through several hundred characters before they reach your body text.
  • When creating textual links, try to use a keyword as the link text. Keyword links will garner much more credit than 'Click Here.'
  • Add keyword-rich ALT text to your graphic links. Make certain, however, that the ALT text is descriptive of the link destination, and that you do not use the same wording for your ALT text throughout the page. Repeating the same ALT text on every link is fairly easy for a spider to spot.
  • Some search engines look for patterns and proximity in your text and favour pages where your keywords occur close to each other.
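The guidelines above can be sketched in markup. Everything here is a made-up illustration: the keyword "hand-made widgets", the file names, and the link destinations are placeholders, not real pages:

```html
<!-- Keyword in a heading tag, not just enlarged text -->
<h1>Hand-Made Widgets</h1>

<!-- Keyword-rich opening sentence, high on the page -->
<p>Our hand-made widgets are crafted from sustainable oak and brass.</p>

<!-- Keyword anchor text instead of "Click Here" -->
<a href="widgets.html">hand-made widget catalogue</a>

<!-- Graphic link with descriptive, non-repeated ALT text -->
<a href="oak-widgets.html">
  <img src="oak-widget.jpg" alt="Oak widget with brass fittings">
</a>
```

Note that each ALT text describes its own link destination; copying "hand-made widgets" into every ALT attribute would be exactly the kind of repetition a spider can easily flag.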


Investigate and understand the correct use of alt attributes on your graphics (they are required if you want your pages to validate to W3C standards) and the use of title attributes on many of your page elements.

The title attribute is little used, but it is the correct way to produce those handy little pop-up tooltips that help your users understand what they are seeing; in addition, title text is read by the search engines.

For an example of tooltips generated by title attributes, you can hover your mouse over the graphics in the menu header bar.
