SEO Basics: Ensure Your Site is Crawlable

Posted on Oct 7th, 2008
Written by Lee Odden

Online Marketing Blog has always been about more than search engine optimization, covering digital public relations, social media marketing and plenty of other internet marketing topics. And yet, SEO is a core component of nearly everything we do in our consulting practice.

For web site owners and marketers fairly new to the site optimization game, we like to offer tips based on common questions overheard in the course of providing SEO and internet marketing services.

Question: “‘Ensure your site is crawlable’: what would be, in your opinion, the basic conditions that need to be met for a website to be crawled?”

“Crawlable” means the links to and within your web site can be discovered and followed by search engine spiders. Spiders, or bots, are programs that search engines send out to find and re-visit content (web pages, images, video, PDF files, etc.). If a search engine spider cannot follow a link, the destination page will either not be included at all, or exist in the search engine’s database but not be included in the universe of web pages eligible to appear in search results.
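
To make that concrete, here is a minimal sketch, in Python, of the discovery step a spider performs: fetch a page, parse its HTML, and collect every URL exposed in an anchor tag’s href attribute. This is only an illustration of the idea, not how any particular search engine is implemented, and the example.com URL is just a placeholder.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects the href value of every <a> tag, the way a simple spider would."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page they were found on.
                    self.links.append(urljoin(self.base_url, value))


def discover_links(url):
    """Fetch one page and return the URLs a crawler could follow from it."""
    html = urlopen(url).read().decode("utf-8", errors="replace")
    parser = LinkExtractor(url)
    parser.feed(html)
    return parser.links


if __name__ == "__main__":
    # Placeholder URL -- substitute a page from your own site.
    for link in discover_links("https://www.example.com/"):
        print(link)
```

A real spider adds queuing, politeness rules and re-visit scheduling on top of this, but the core point stands: a URL that never appears as a followable link is invisible to this kind of discovery.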

A few of the common issues that can make it difficult for search engine spiders to crawl a web site effectively include:

• Navigation links embedded in Flash – Spiders for most search engines do not typically crawl links within Flash files, although Google reports progress in improving Flash indexing.
• Navigation links embedded in JavaScript or Ajax – Again, spiders like Googlebot have historically had issues with crawling links embedded in JavaScript menus but have made some progress. Links within Ajax web pages are still problematic for crawling (a short illustration follows this list).
• Embedding site navigation links within forms – Most search engine bots cannot fill out forms (except sometimes Google). If the user has to select an item from a drop-down menu or fill in a form field to see content, that content is unlikely to be discovered and indexed by search engines.
• Lack of authoritative links into the web site – Search engines discover new web sites through links. Links from one site to another convey important information about the link destination and influence rankings. A lack of relevant links to the home page and interior pages of a site, coupled with other factors, doesn’t make the site “uncrawlable” so much as it makes it unlikely to be crawled any time soon or often.
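
To see why the JavaScript and form examples above are problematic, consider what a link-extracting parser actually finds in each style of navigation markup. The snippet below is a hypothetical illustration with made-up URLs; it feeds three versions of the same “Products” link through a simple href collector, and only the plain HTML anchor yields a URL a spider could follow.

```python
from html.parser import HTMLParser

# Three ways the same "Products" link might be coded. The URLs are made up.
NAV_SNIPPETS = {
    "plain anchor": '<a href="/products/">Products</a>',
    "JavaScript onclick": '<span onclick="window.location=\'/products/\'">Products</span>',
    "form drop-down": '<select onchange="window.location=this.value">'
                      '<option value="/products/">Products</option></select>',
}


class HrefCollector(HTMLParser):
    """Records href attributes of <a> tags -- the only URLs a simple crawler sees."""

    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.hrefs.extend(value for name, value in attrs if name == "href")


for label, snippet in NAV_SNIPPETS.items():
    collector = HrefCollector()
    collector.feed(snippet)
    print(f"{label:20} -> crawlable links found: {collector.hrefs}")

# Only the plain anchor exposes /products/; the onclick and drop-down versions
# hide the destination inside JavaScript, where a link-following parser never looks.
```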

If your site has, say, 500 pages but only 350 are getting crawled and included in the search engine’s public index, that means 150 pages are not working for you to attract traffic.

To solve these issues, you can:

• Create navigation elements with search-engine-friendly CSS code that still offers much of the dynamic functionality often found in Flash, JavaScript and Ajax.
• Create alternative navigation with text links elsewhere on the web page, in the footer and/or in breadcrumb navigation.
• Create HTML site map pages made up of 100 or fewer text links to important pages on your site. You can have more than one sitemap page for sites larger than 100 pages.
• Provide search engines with an XML sitemap listing all the URLs from your web site that you would like crawled (a short generator sketch follows this list). This does not guarantee all URLs will be included, but it can supplement what search engine spiders find on their own. There are also useful reporting options.


Each major algorithmic search engine supports the sitemap protocol and offers services/tools for webmasters: Google Webmaster Tools, Yahoo Site Explorer and Microsoft Live Webmaster Tools.

• Encourage inbound links from authoritative web sites to your home page as well as to important (and linkable) content within the site. Cross-link between pages with descriptive anchor text and make it easy for customers and spiders to reach all areas of the site.
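
To illustrate the XML sitemap suggestion above, here is a small sketch of generating one with Python’s standard library. The page list and the output filename are placeholders; the element names and namespace follow the sitemaps.org protocol that these webmaster tools accept.

```python
import xml.etree.ElementTree as ET

# Placeholder list -- in practice you would pull this from your CMS or a site crawl.
PAGE_URLS = [
    "https://www.example.com/",
    "https://www.example.com/products/",
    "https://www.example.com/about/",
]

# The namespace required by the sitemaps.org protocol.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"


def build_sitemap(urls):
    """Return an ElementTree for a minimal <urlset> sitemap."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for page_url in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = page_url
    return ET.ElementTree(urlset)


if __name__ == "__main__":
    # Writes sitemap.xml in the current directory; upload it to your site root
    # and submit it through each engine's webmaster tools.
    build_sitemap(PAGE_URLS).write(
        "sitemap.xml", encoding="utf-8", xml_declaration=True
    )
```

Optional lastmod, changefreq and priority elements can be added for each URL, and the finished file is typically uploaded to the site root and submitted through the webmaster tools mentioned above.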

If you’re developing a new web site, it’s important to make sure your web developers are designing and planning the site architecture not only for functionality, user experience and ease of site maintenance, but also for search engines. Getting SEO consultants or good SEO advice involved at the very beginning of a web site project can save considerable headache and expense down the road.

For some vintage crawler SEO advice, check out this post on improving site spidering from 2006, and of course there’s this SEO Basics article, which covers more bases than crawling.