Website Design Problem Areas

Written by Suresh Kalyanasundaram

16/01/2016

Despite our best intentions, some of today’s most popular web technologies can be a major stumbling block for search engine spiders. Let’s take a quick look at these stumbling blocks and at some simple solutions you can implement to overcome them.

Website Design Problem Areas

  • Frames – In the past spiders could not read frames, but now almost all of the major spiders can. If you do need to work around a particular spider, the quick solution is to place keyword-rich content inside your “No Frames” tag to optimize the page. It is also advisable to include a base href tag in your header so search engines can resolve the links inside your frames (see the first sketch after this list).
  • Password Protected Pages – are pages you probably don’t want to be indexed anyway. Just be aware that like a human, the spider cannot enter any area that is protected by a password.
  • Flash Sites – while beautiful, cannot be read by most spiders, although as of late 2004 Google has been reading, indexing and ranking Flash pages based on the text content inside the Flash. Your options are to use a keyword-rich entrance page, to create a two-frame frameset where one frame is only one pixel in height and use the No Frames area, or to alternate with static HTML. Also note that Adobe will be discontinuing support for Flash in 2020 (posters have cited February 16, 2020 as an alleged shutdown date) and has encouraged those who use Flash on their websites to begin converting to newer formats such as HTML5. Many popular browsers such as Chrome, Microsoft Edge and Safari already block Flash by default, and in 2020 all support will officially end.
  • Image Maps – can be read by some spiders but not by others. If you plan to use an image map, make sure there are other links on the page (perhaps at the bottom) that link to your other pages, or better still to a site map that links to all your pages with good anchor text (a sketch follows this list).
  • Meta Refresh Tag – this tag has been so abused by the XXX industry that it is now considered spam by the engines. Perhaps it is not really a stumbling block, but the spiders have been programmed to run from it. 
  • PDF Files – also known as Adobe Acrobat files, present a major stumbling block to most spiders, although some engines (notably Google) have begun to index this kind of page.
  • Dynamic Pages – Some search engine spiders have problems with dynamic pages that contain variables in the URL. This is most often seen with dynamic pages generated by CGI, ASP, or Cold Fusion. Google, for instance, will not index pages whose URL contains id= followed by more than ten characters, or pages with too many variables in the URL. If you are having problems with dynamically generated pages, consider using the rewrite module of the Apache server to rewrite those dynamic URLs into static-looking URLs (see the rewrite sketch after this list), or a similar add-on if you are hosted on a Windows server. There are also PHP scripts that can be implemented to turn the address into a readable page. You can find information and help in accomplishing this at many of the SEO forums listed in the Search Engine Forums link.
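
For the Frames point above, here is a minimal sketch of what the header and No Frames area might look like. The page title, domain and frame file names are hypothetical, used only to illustrate the idea:

    <html>
    <head>
      <title>Example Widgets</title>
      <!-- base href helps spiders resolve the relative links used by the frames -->
      <base href="http://www.example.com/">
    </head>
    <frameset cols="200,*">
      <frame src="menu.html" name="menu">
      <frame src="content.html" name="content">
      <noframes>
        <body>
          <!-- keyword-rich copy and plain links that any spider can read -->
          <h1>Example Widgets</h1>
          <p>We design and sell handcrafted example widgets.</p>
          <p><a href="menu.html">Menu</a> |
             <a href="content.html">Products</a> |
             <a href="sitemap.html">Site Map</a></p>
        </body>
      </noframes>
    </frameset>
    </html>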
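For the Image Maps point, a sketch (file names and coordinates are made up) of an image map backed by plain text links at the bottom of the page, so every spider has something it can follow:

    <!-- graphic navigation via an image map -->
    <img src="navbar.gif" alt="Site navigation" usemap="#nav">
    <map name="nav">
      <area shape="rect" coords="0,0,100,40"   href="products.html" alt="Products">
      <area shape="rect" coords="100,0,200,40" href="about.html"    alt="About Us">
    </map>

    <!-- plain text links so spiders that ignore image maps can still crawl the site -->
    <p>
      <a href="products.html">Products</a> |
      <a href="about.html">About Us</a> |
      <a href="sitemap.html">Site Map</a>
    </p>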
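And for the Dynamic Pages point, a minimal rewrite sketch, assuming an Apache host with the rewrite module (mod_rewrite) enabled and placed in an .htaccess file; the script name and id parameter are hypothetical:

    # map a static-looking URL such as /products/123.html
    # onto the underlying dynamic script product.php?id=123
    RewriteEngine On
    RewriteRule ^products/([0-9]+)\.html$ product.php?id=$1 [L]

With a rule like this, your pages link to the clean /products/123.html address, while the server quietly serves the dynamic page behind it.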

Tips

While it is a relatively new standard and not used as widely as it should be, Cascading Style Sheets (CSS) gives you more freedom and flexibility in your page design while at the same time offering many SEO advantages.

This site is built with CSS and may be considered a modest example of what can be done with CSS that is difficult to do in a search-engine-friendly way using older design techniques.
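As a small illustration (the class name and colors are made up), CSS lets you keep navigation as plain, spiderable text links while still controlling how they look, instead of relying on image buttons or Flash menus:

    <style>
      /* style ordinary text links to look like buttons; spiders still read plain anchor text */
      .nav a {
        display: inline-block;
        padding: 6px 12px;
        background: #336699;
        color: #ffffff;
        text-decoration: none;
      }
    </style>

    <p class="nav">
      <a href="products.html">Products</a>
      <a href="about.html">About Us</a>
      <a href="sitemap.html">Site Map</a>
    </p>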

