“If I had to pick only one thing to do with my web site to improve its search engine visibility, what would it be?”
It’s a question that comes up often, and most people asking it expect that there’s one right answer. The trouble is that web sites and situations differ, and so does the “just one thing” answer. It really depends on the situation.
Picture this: a web site that is 6 years old, has 10,000 pages and 25,000 inbound links, and publishes new content daily. Yet the site gets less than 5% of all traffic via organic search. Why might that be?
The “one thing” could be navigation that blocks search engine crawlers from finding substantial portions of the site’s content. It could be that the first 40 or 50 characters of every title tag are hard coded with the site name and nothing else, or that there are no title tags at all. It could be many things.
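For readers who want to spot-check the title tag problem just described, here is a minimal sketch in Python. The page URLs and the site name below are hypothetical placeholders, not anything from this post; the script simply fetches each page and flags titles that are missing or consist of nothing but the site name.

```python
# Minimal sketch: flag pages whose <title> is missing or is only the site name.
# SITE_NAME and the URLs in PAGES are hypothetical placeholders.

import urllib.request
from html.parser import HTMLParser

SITE_NAME = "Example Corp"
PAGES = [
    "https://www.example.com/",
    "https://www.example.com/about/",
    "https://www.example.com/news/",
]

class TitleParser(HTMLParser):
    """Collects the text inside the first <title> element."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title" and not self.title:
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

for url in PAGES:
    with urllib.request.urlopen(url) as response:
        html = response.read().decode("utf-8", errors="replace")
    parser = TitleParser()
    parser.feed(html)
    title = parser.title.strip()
    if not title or title == SITE_NAME:
        print(f"PROBLEM  {url}: title missing or just the site name ({title!r})")
    else:
        print(f"OK       {url}: {title}")
```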
The challenge is to find out what the issue is or, more likely, what the numerous “things” are that collectively contribute to poor performance in search engines, and then to recommend a course of action to fix them and keep them fixed.
All that said, if I HAD to pick one thing, it would be to make sure a site is crawlable by search engine spiders. If a search engine can’t find the content, it doesn’t matter whether you use keywords in the right places and at the right frequency, execute internal links perfectly or apply any other tactic. Content that can’t be crawled simply won’t be included in search results, at least not in a meaningful or useful way.
Solving such a problem, if it exists, will help not only PR and news-related content, but overall company web site visibility.
To see whether a site is properly indexed by search engines, the site webmaster can compare the known number of pages published to the web with the number of pages actually found in a search engine such as Google. In addition, the site can be verified with Google’s Webmaster Tools, which provides insight into any crawling errors encountered by Googlebot, the software Google uses to visit web sites and supply content for its search results.
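As a rough illustration of that comparison, here is a minimal Python sketch, assuming the site publishes a standard sitemap.xml at a hypothetical address. It counts the URLs the site claims to publish; that figure can then be compared by hand with the page count a site: query returns in Google and with the crawl errors reported in Webmaster Tools.

```python
# Minimal sketch: count the URLs listed in a site's XML sitemap so the total
# can be compared against the number of pages a search engine reports and
# against crawl errors shown in Google's Webmaster Tools.
# The sitemap location below is a hypothetical placeholder.

import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def count_sitemap_urls(sitemap_url: str) -> int:
    """Return the number of <loc> entries in a standard XML sitemap."""
    with urllib.request.urlopen(sitemap_url) as response:
        tree = ET.parse(response)
    return len(tree.getroot().findall("sm:url/sm:loc", NS))

if __name__ == "__main__":
    published = count_sitemap_urls(SITEMAP_URL)
    print(f"URLs listed in the sitemap: {published}")
    print("Compare this with the page count Google reports for the site "
          "and with any crawl errors flagged in Webmaster Tools.")
```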
Most PR professionals are not in a position to assess the health of the web hosting server, content management system, templates or database that make the corporate site work. Ensuring a web site is being properly crawled and indexed should be handled by properly trained and experienced staff.
To help convince those staff to do this kind of work, check out the following articles on making your web site crawler friendly: “Ensure Your Site Is Crawlable”, “SEO Tip: Let the Spider Crawl” and “6 Tips for Google Webmaster Tools”.
This post is the first in a series of 10 Basic SEO Tips for PR practitioners, “Top Ten SEO Tactics for PR Professionals”, to be published over the next 2 weeks. Read on for the next post, “PR tactics that affect SEO”.
I just got that myself yesterday regarding my forums as well as my blogs – it’s a very important point indeed.
Barbara
Funny that I stumbled on this blog – I deal with PR for lovethoseshoes.com and thankfully, we’ve just begun the process of SEO on our site. For me it is honestly like opening up another world – so many questions I’ve had over the years are now starting to be answered! I think it is going to make a huge difference to our visitor numbers, the quality of those visitors and the sales conversions. Brill tip on this blog for anyone in PR who is wondering why traditional offline PR isn’t working as well as it should.
Thanks, Lee. As a seasoned marketing, PR and corporate communications professional who just launched my own firm and a new website, I find there is no end to the many things that “need” to be done “now”. Your blog article helps crystallize the list. I’ll watch your blog for continued tips.
Joanne Maly at Lincoln Maly Marketing in Cincinnati, Ohio
Thank you Joanne. As awareness builds and an appreciation for what needs to be done increases, so does the need for prioritization. I hope the ensuing series of posts about SEO for PR helps with that. 🙂
Another important factor to consider is that even if a website is 100% search engine friendly, has an intuitive, anchor-text-optimized sitemap and an XML sitemap, search engines might still not index all of the site’s pages. The main reason is that search engines crawl the web at an average rate per day across the trillion-plus URLs in their databases, so crawling and indexing a site with millions of pages can take well over 6 months as search engines break the work down into phases. Sites this big also need more than one XML sitemap, typically one per subdirectory, to ensure an efficient crawling and indexing rate. One thing is for sure: don’t expect search engines to crawl an entire site at once unless it is a small business or personal site. Furthermore, don’t misinterpret the fact that you don’t see all of your URLs indexed in search engine databases; it may well be that search engines are crawling your site in phases and indexing pages accordingly.
Cheers,
Sherman Lopez
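For the very large sites Sherman describes, one common approach is a sitemap index file that points to one child sitemap per section of the site. The snippet below is a minimal Python sketch of that idea; the domain and section names are hypothetical placeholders, not taken from any site mentioned here.

```python
# Minimal sketch: build a sitemap index that references one child sitemap
# per subdirectory of a large site. DOMAIN and SECTIONS are placeholders.

import xml.etree.ElementTree as ET

DOMAIN = "https://www.example.com"
SECTIONS = ["news", "products", "support"]  # one child sitemap per section

index = ET.Element("sitemapindex",
                   xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for section in SECTIONS:
    sitemap = ET.SubElement(index, "sitemap")
    loc = ET.SubElement(sitemap, "loc")
    loc.text = f"{DOMAIN}/{section}/sitemap.xml"

# Write sitemap_index.xml, which can then be referenced from robots.txt or
# submitted through the search engines' webmaster tools.
ET.ElementTree(index).write("sitemap_index.xml",
                            encoding="utf-8", xml_declaration=True)
```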
Thanks Sherman. I’m not sure I’d underestimate Google’s ability to crawl a large site. Even so, there are few sites with so many pages that Google would need an expansive amount of time to crawl them. Crawl priority is also set higher when there are a substantial number of deep inbound links to the site. Registering with Google Webmaster Tools will reveal what has been indexed and what has not, as accurately as any PR professional dealing with SEO would need to know.
A site that large and that old should not be getting most of its traffic from search. What I feel is missing from your example is a target level for search referrals. What’s the point where the search engine optimization is declared to be “successful”?
If the site is relevant only to dead or dying query spaces, it cannot expect much traffic from search. Maybe 5% of site traffic is all it can expect. Some sites simply don’t evolve.
I think the hypothetical scenario needs to be fleshed out.
An experienced SEO would be tempted to drill down into the specifics of a hypothetical, but I think the point would be lost in that, Michael.
The advice in this post centers on one of many potential search visibility issues that public relations professionals should be aware of. SEO is still so new to the PR industry that we’re building awareness first. The majority of PR people I talked to yesterday at a PR conference didn’t know what crawling or indexing issues were.
Now that they do, they can tackle the specifics like the keyword demand issue you’re bringing up.
Lee –
Fantastic post, and a great idea for a series. In so many ways, PR and search are merging, and most PR practitioners have simply never been taught SEO basics.
I completely agree with site crawlability. Without it, no other efforts are going to pay off.
Good stuff. I’m looking forward to reading the rest.
Best regards,
j
Thanks Jason. In his keynote, Brian Solis said the same thing about the convergence of disciplines. My motivation is to educate PR practitioners on the basics of SEO so they’re better empowered to win budget for SEO efforts, possibly bringing in outside resources like TopRank to do one-time audits or ongoing work as needed. For many others, it will be the stimulus to invest in SEO resources in-house.
Either way, the company wins with better search visibility.
A crawlable site, so simple, yet so elusive. Thanks for the insight.
Raza Imam
“a die-hard Blackberry user with a soft spot for the Palm Pre”
Great Site! Thanks for the new views on the subject. It was extremely helpful.
BusinessHQR.com
hi,
I just came across this blog today. Your post is very useful for me. I have to read more about how to increase my PR. Thanks for your post, it really helps.
As an SEO consultant for my business at internettoolsu.com, I run into this problem all the time. Having many pages that are not crawlable seems to be a common problem for website owners. I see it happen in particular with sites that are put up fairly quickly; when the spiders come in to index the pages, they get lost in the navigation. I look at it a bit like people surfing and losing track of their original intent; spiders do get distracted by how navigation menus are set up. My recommendation for anyone getting started on a mega site is to make sure each page, as it is built, is pinged or somehow linked so that the search engines know how to find it and get it indexed quickly before any new pages are added. Google seems to like a growing site; I don’t see good results for website owners who notify Google only after a mega site is built. Slow and steady works best. For already established websites, it becomes a lot of work to get those inner pages indexed.
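As a rough sketch of the “ping as you publish” idea in the comment above, the snippet below (Python, with a placeholder sitemap URL) notifies Google’s sitemap ping endpoint after the sitemap has been regenerated. This assumes the site rebuilds its XML sitemap whenever a page is added; the ping endpoint shown is the one Google documented for sitemap submissions at the time this post was written.

```python
# Minimal sketch: notify Google that the XML sitemap has been updated.
# SITEMAP_URL is a hypothetical placeholder for the site being promoted.

import urllib.parse
import urllib.request

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder

ping = "https://www.google.com/ping?sitemap=" + urllib.parse.quote(SITEMAP_URL, safe="")
with urllib.request.urlopen(ping) as response:
    print(f"Pinged Google, HTTP status {response.status}")
```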