Online Marketing Blog has always been about more than search engine optimization, covering digital public relations, social media marketing and plenty of other internet marketing topics. And yet, SEO is a core component of nearly everything we do in our consulting practice.
For web site owners and marketers who are fairly new to the site optimization game, we like to offer tips based on common questions that come up in the course of providing SEO and internet marketing services.
Question: “Regarding ‘Ensure your site is crawlable’ – what would be, in your opinion, the basic conditions that need to be met for a website to be crawled?”
“Crawlable” means the links to and within your web site can be discovered and followed by search engine spiders. Spiders or bots are programs that search engines send out to find and re-visit content (web pages, images, video, PDF files, etc). If a search engine spider cannot follow a link, the destination page will either not be included at all, or it will exist in the search engine’s database but never be surfaced among the web pages available to search results.
A few of the common issues that can make it difficult for search engine spiders to crawl a web site effectively include:
- Navigation links embedded in Flash – Spiders for most search engines do not typically crawl links within Flash files, although Google reports progress in improving Flash indexing
- Embedding site navigation links within forms – Most search engine bots cannot fill out forms (except sometimes Google). If the user has to select an item from a drop-down menu or fill in a form field to see content, that content is unlikely to be discovered and indexed by search engines.
- Lack of authoritative links into the web site. Search engines discover new web sites through links. Links from one site to another convey important information about the link destination and influence rankings. A lack of relevant links to the home page and interior pages of a site, coupled with other factors, doesn’t make the site “uncrawlable” so much as unlikely to be crawled any time soon, or often.
If your site has, say, 500 pages, but only 350 are getting crawled and included in the search engine’s public index, that means 150 pages are not working to attract traffic for you.
To solve these issues you can:
- Create alternative navigation with text links elsewhere on the web page, either in the footer and/or in breadcrumb navigation
- Create HTML site map pages made up of 100 or fewer text links to important pages on your site. You can have more than one site map page for sites larger than 100 pages
- Provide search engines with an XML site map listing all the URLs from your web site that you would like crawled. This does not guarantee all URLs will be included, but it can supplement what search engine spiders find on their own, and webmaster tools programs offer useful reporting on how submitted URLs are handled
- Encourage inbound links from authoritative web sites to your home page as well as to important (and linkable) content within the site. Cross-link between pages with descriptive anchor text, and make it easy for customers and spiders to reach all areas of the site
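To illustrate the alternative-navigation suggestion above, here is a minimal sketch in plain HTML, using hypothetical page names, of crawlable text links in a footer and in breadcrumb navigation:

```html
<!-- Breadcrumb navigation: plain text links spiders can follow -->
<p class="breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/products/">Products</a> &gt;
  <a href="/products/widgets.html">Widgets</a>
</p>

<!-- Footer navigation: duplicates links that might otherwise
     exist only in Flash or form-based menus -->
<div class="footer-nav">
  <a href="/about.html">About Us</a> |
  <a href="/products/">Products</a> |
  <a href="/services/">Services</a> |
  <a href="/sitemap.html">Site Map</a> |
  <a href="/contact.html">Contact</a>
</div>
```

Because these are ordinary anchor tags in the page’s HTML, spiders can discover and follow them even when the primary navigation is embedded in Flash or a drop-down form.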
If you’re developing a new web site, it’s important to make sure your web developers are not only designing and planning the site architecture for functionality, user experience and ease of site maintenance, but also for search engines. Getting SEO consultants or good SEO advice at the very beginning of a web site project can save considerable headache and expense down the road.
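As a reference for the XML site map tip above, here is a minimal sitemap.xml following the sitemaps.org protocol; the domain and URLs are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/products/</loc>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Upload the file to your site’s root and submit it through the search engines’ webmaster tools; the optional changefreq and priority values are hints to spiders, not guarantees of crawling or ranking.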