
SEO Basics: Ensure Your Site is Crawlable

Lee Odden     Online Marketing, SEO, SEO Tips

Online Marketing Blog has always been about more than search engine optimization, covering digital public relations, social media marketing and plenty of other internet marketing topics. And yet, SEO is a core component of nearly everything we do in our consulting practice.

For web site owners and marketers who are fairly new to the site optimization game, we like to offer tips based on common questions heard in the course of providing SEO and internet marketing services.

Question: “‘Ensure your site is crawlable’: what would be, in your opinion, the basic conditions that need to be met for a website to be crawled?”

“Crawlable” means the links to and within your web site can be discovered and followed by search engine spiders. Spiders or bots are programs that search engines send out to find and re-visit content (web pages, images, video, PDF files, etc.). If a search engine spider cannot follow a link, the destination page will either not be included at all, or exist in the search engine’s database but be excluded from the universe of web pages available to search results.
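Beyond following links, well-behaved spiders also check a site’s robots.txt rules before fetching a URL. Here is a minimal sketch using Python’s standard library; the rules and URLs are hypothetical, and real search engine crawlers are of course far more elaborate:

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# Normally you would point this at a live site:
#   rp.set_url("https://www.example.com/robots.txt"); rp.read()
# Here the rules are parsed inline to keep the sketch self-contained.
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# A spider asks this question before requesting any URL.
print(rp.can_fetch("Googlebot", "https://www.example.com/page"))       # True
print(rp.can_fetch("Googlebot", "https://www.example.com/private/x"))  # False
```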

A few of the common issues that can make it difficult for search engine spiders to crawl a web site effectively include:

  • Navigation links embedded in Flash – Spiders for most search engines do not typically crawl links within Flash files, although Google reports progress in improving Flash indexing.
  • Navigation links embedded in JavaScript or Ajax – Spiders like Googlebot have historically had issues crawling links embedded in JavaScript menus, though they have made some progress. Links within Ajax web pages are still problematic for crawling.
  • Site navigation links embedded within forms – Most search engine bots cannot fill out forms (except, sometimes, Google). If the user has to select an item from a drop-down menu or fill in a form field to see content, that content is unlikely to be discovered and indexed by search engines.
  • Lack of authoritative links into the web site – Search engines discover new web sites through links. Links from one site to another convey important information about the link destination and influence rankings. A lack of relevant links to the home page and interior pages of a site, coupled with other factors, doesn’t make the site “uncrawlable” so much as unlikely to be crawled any time soon, or often.
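The common thread in the issues above is that a spider can only follow links that exist as plain HTML anchors in the page source. A rough sketch of link discovery, using Python’s standard library (the page markup is made up for illustration, and real crawlers are far more sophisticated):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags, the way a basic spider would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

page = """
<nav>
  <a href="/products">Products</a>
  <a href="/about">About</a>
</nav>
<!-- A JavaScript-built menu leaves nothing for the parser to find: -->
<script>buildMenu(["/hidden-page"]);</script>
"""

parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # ['/products', '/about'] -- the script's URL is never seen
```

The URL inside the script block is invisible to this kind of parsing, which is why navigation built entirely in JavaScript, Flash or Ajax can leave whole sections of a site undiscovered.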

If your site has, say, 500 pages but only 350 are getting crawled and included in the search engine’s public index, then 150 pages are not working for you to attract traffic.

To solve these issues you can:

  • Create navigation elements with search-engine-friendly CSS code that still offers much of the dynamic functionality often found in Flash, JavaScript and Ajax
  • Create alternative navigation with text links elsewhere on the web page, either in the footer and/or in breadcrumb navigation
  • Create HTML site map pages made up of 100 or fewer text links to important pages on your site. You can have more than one site map page for sites larger than 100 pages
  • Provide search engines with an XML site map listing all the URLs from your web site that you would like crawled. This does not guarantee all URLs will be included, but it can supplement what search engine spiders find on their own. There are also useful reporting options.

Each major algorithmic search engine supports the sitemap protocol and offers services/tools for webmasters: Google Webmaster Tools, Yahoo Site Explorer and Microsoft Live Webmaster Tools.

  • Encourage inbound links from authoritative web sites to your home page as well as to important (and linkable) content within the site. Cross-link with anchor text between pages and make it easy for customers and spiders to reach all areas of the site.
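As a concrete illustration of the XML site map suggestion above, here is a minimal sketch that builds a sitemaps.org-style file with Python’s standard library. The URLs are placeholders; a real list would come from your CMS or a crawl of your own site:

```python
import xml.etree.ElementTree as ET

# Hypothetical URLs; in practice, list every page you want crawled.
urls = [
    "https://www.example.com/",
    "https://www.example.com/products",
    "https://www.example.com/about",
]

# <urlset> is the root element defined by the sitemap protocol.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in urls:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = url

print(ET.tostring(urlset, encoding="unicode"))
```

The resulting XML can be saved as sitemap.xml at the site root and submitted through the webmaster tools each engine provides.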

If you’re developing a new web site, it’s important to make sure your web developers are not only designing and planning the site architecture for functionality, user experience and ease of site maintenance, but also for search engines. Getting SEO consultants or good SEO advice at the very beginning of a web site project can save considerable headache and expense down the road.

For some vintage crawler SEO advice, check out this post on improving site spidering from 2006 and of course there’s this SEO Basics article covering more bases than crawling.


About Lee Odden

@LeeOdden is the CEO of TopRank Marketing and editor of Online Marketing Blog. Cited for his expertise by The Economist, Forbes and the Wall Street Journal, he's the author of the book Optimize and presents internationally on B2B marketing topics including content, search, social media and influencer marketing. When not at conferences, consulting, or working with his talented team, he's likely running, traveling or cooking up something new.


  1. Robert Stanley says

    Thanks for the post. I didn’t think to submit my site map to Yahoo or Microsoft. Good Stuff!

  2. SEO is a mystery wrapped in an enigma inside a puzzle, all controlled by voodoo.
    Besides tools like Google Webmaster Tools and such, I find that a complete site map helps immensely, as of course do incoming links and backlinks.
    Also, in my experience ActionScript 3.0 has strong support for SEO. Can’t say the same for AJAX, though that depends on the library.

  3. Thanks for the simple explanation on crawlable web sites for beginners. This is a great resource for people starting out.

  4. I think another point can be bad/wrong implementation of robots.txt. If your pages are ‘roboted out’, engines won’t be able to crawl your site/pages properly.

  5. I just used a free tool called Website Grader at http://websitegrader.com to measure my site’s SEO effectiveness. It gave me great feedback that I used to make improvements. Nothing in it for me. Just thought I’d pass it along to try.

  6. Great advice! The one thing I see most of the time on sites that are designed by graphics people is no site map page. This one thing can make the world open up for your site.

    Lee Watters

  7. Great post! Just submitted my sitemap to Google. Good reminder that internal links are just as important as external links.


  8. Michelle / chelpixie says

    Lee, you keep providing extremely useful and interesting information and I keep sharing it with our readers in the daily vibe roundup. Thanks for sharing your wonderful content.

  9. Thanks Michelle, I’m glad you find the posts useful and thank you for sharing them with your readers.

