Many obstacles can reduce web site crawlability - and impact your chances of achieving a coveted spot on a search engine results page. Learn the SEO fundamentals from an industry expert and put his SEO best practices to work for you.

SEO Fundamentals
Improving Web Site Crawlability


By icrossingseo


Over the years, search engine usage has grown steadily. Consumers use search engines to do everything from researching investments to buying shoes. With this in mind, ensuring your business has good visibility in search engines should be a critical element of your marketing mix.


The practice of search engine optimization (SEO) is concerned with maximizing the visibility of your business by ensuring its web pages appear frequently and prominently in search engine results pages (SERPs).



The goal of this marketing channel - and of the SEO best practices outlined in this article - is to create an environment where all businesses, large and small, can achieve top rankings in SERPs on leading search engines for search terms relevant to their business.


Search engines assign weights to various elements on a page and use algorithms to determine the value of each element - and thus how pages will rank in the SERPs. These algorithms vary by search engine and are regularly modified, so there is no "silver bullet" for getting a page-one ranking. Instead, it's best to follow the engines' SEO best practices - and your own good judgment - in developing and maintaining your web site, and ultimately your brand.


This series of articles describes SEO fundamentals and is designed to help you not only understand how the engines evaluate various elements of your web site, but also learn to optimize your site in three specific areas (crawlability, relevancy, and popularity) for improved visibility.


Web Site Crawlability


To achieve visibility, search engines must be able to find the content of your site. Using programs known as "spiders," "bots" or "crawlers," they "read" the on-page content and take cues from a site's underlying code. These crawlers follow a very simple process: read content and follow links (i.e., crawling).


To achieve top rankings, it's imperative that spiders are granted access to all the salient points of your web site. The following are some common factors that can prevent a search engine spider from crawling your site and ultimately reduce the visibility of your business on the web.


Duplicate Content


Duplicate content exists when two or more URLs display the same content. Search engines can become confused as to which URL is the authoritative source, and they will often reduce rankings for web sites with duplicate content.
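
One widely supported way to consolidate duplicates is to point every variant at a single preferred URL, for example with a 301 redirect or a rel="canonical" link element. A minimal sketch, with a placeholder domain and path:

    <!-- Placed in the <head> of each duplicate version of the page, -->
    <!-- this tells engines which URL is the preferred source. -->
    <link rel="canonical" href="http://www.example.com/shoes/" />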




Flash


Although Flash technology creates a rich visual experience for visitors, it can prevent spiders from effectively reading the content on the page. That's because content in a Flash file is broken into several components that are not visible in the HTML code of the page.
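
One commonly suggested workaround - sketched below with hypothetical file names and copy - is to supply equivalent HTML content alongside the Flash object, so spiders that cannot read the .swf file still find indexable text and links:

    <object type="application/x-shockwave-flash" data="promo.swf" width="600" height="400">
      <param name="movie" value="promo.swf" />
      <!-- Fallback markup: displayed when Flash is unavailable and readable by crawlers -->
      <h2>Spring Shoe Collection</h2>
      <p>Browse new arrivals of running, casual and dress shoes.</p>
      <a href="/shoes/">View the full catalog</a>
    </object>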




JavaScript


JavaScript is a useful technology that drives drop-down menus, enables add-to-cart functions and more. Search engines historically have ignored JavaScript code on web pages. This is not to say that JavaScript shouldn't be used, but it's important to use it in a way that won't prevent a spider from crawling the pertinent on-page content. Placing JavaScript in external files and building navigation with standard HTML and CSS are both excellent ways to minimize obstacles created by on-page JavaScript, as shown in the sketch below.
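
A rough sketch of both ideas (file names and URLs are hypothetical):

    <!-- Drop-down behavior kept in an external file instead of inline code -->
    <script type="text/javascript" src="/js/menu.js"></script>

    <!-- Navigation as a plain HTML list: spiders can follow every link -->
    <!-- whether or not the CSS/JavaScript drop-down behavior ever runs -->
    <ul id="main-nav">
      <li><a href="/shoes/">Shoes</a>
        <ul>
          <li><a href="/shoes/running/">Running</a></li>
          <li><a href="/shoes/dress/">Dress</a></li>
        </ul>
      </li>
      <li><a href="/about/">About Us</a></li>
    </ul>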


User-Action and Cookie Requirements


Many web sites require cookies or user actions - to understand user behavior, meet legal requirements, or confirm a visitor's location or age. This is common in regulated industries (financial services, pharmaceutical, energy, etc.). The issue this presents is that a spider cannot crawl past that point: it does not accept cookies, fill in forms or click buttons. If the entry page to a site contains this type of obstacle, the search engine may be prevented from indexing the relevant content behind it.
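
To illustrate why this blocks crawling, here is a hypothetical entry-page gate (the cookie name and URL are invented for this sketch). Any visitor without a confirmation cookie is redirected, and because spiders do not store cookies, they never reach the content behind the gate:

    <script type="text/javascript">
      // Hypothetical age/location gate: visitors without the "confirmed"
      // cookie are sent to an interstitial before seeing any content.
      // A crawler never sets this cookie, so it is redirected on every
      // visit and the pages behind the gate go unindexed.
      if (document.cookie.indexOf("confirmed=1") === -1) {
        window.location.href = "/verify/";
      }
    </script>

Where legal requirements allow, a crawl-friendlier pattern is to keep the page content in the HTML and present the gate as an overlay, or to limit it to the specific actions that genuinely require it.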


Another critical component of web site crawlability is the internal linking structure. Because search engine spiders only read content and follow links, it's important to develop an easy-to-follow internal linking structure so that both visitors and spiders can navigate your site and reach all the critical content quickly. SEO best practices suggest creating easy-to-follow navigation and an internal sitemap page - like the sketch below - to facilitate better crawling and provide a better user experience.
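
A minimal sketch of such a sitemap page (the URLs and section names are placeholders), typically linked from every page's footer so spiders and visitors alike can reach each section in a click or two:

    <!-- sitemap.html: a plain list of links to every major section of the site -->
    <h1>Site Map</h1>
    <ul>
      <li><a href="/">Home</a></li>
      <li><a href="/shoes/">Shoes</a>
        <ul>
          <li><a href="/shoes/running/">Running Shoes</a></li>
          <li><a href="/shoes/dress/">Dress Shoes</a></li>
        </ul>
      </li>
      <li><a href="/about/">About Us</a></li>
      <li><a href="/contact/">Contact Us</a></li>
    </ul>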


Once all web site crawlability issues have been solved, your site will be ready for optimization. The next article in this SEO Fundamentals series will cover relevancy - the second phase of any SEO program.


About the Author: Richard Chavez is an SEO Manager at iCrossing, a leading digital marketing agency, who develops search strategies for Fortune 500 clients in the financial services, publishing, CPG and pharmaceutical sectors.