B2B Marketing Blog - TopRank®

B2B Marketing views, news and interviews.


SEO Tactics For PR: If I Could Only Do One Thing…

Lee Odden
Online Marketing, Online PR, Public Relations, SEO, SEO Tips


“If I had to pick only one thing to do with my web site to improve its search engine visibility, what would it be?”

It’s a question that comes up often, and most people asking it expect that there’s one right answer. The trouble is, web sites and situations differ, and so does the “just one thing” answer. It really depends on the situation.

Picture this: a web site that is 6 years old, has 10,000 pages and 25,000 inbound links, and publishes new content daily. Yet the site gets less than 5% of all its traffic via organic search. Why might that be?

The “one thing” could be navigation that blocks search engine crawlers from finding substantial portions of the site’s content. It could be that the first 40 or 50 characters of every title tag are hard coded with the site name and nothing else, or that there are no title tags at all. It could be many things.
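To make the title tag symptom easier to spot, here is a minimal sketch (not part of the original post) that pulls the <title> from a handful of pages and flags titles that are missing or that all begin with the same hard-coded prefix. The URLs and the 40-character cutoff are hypothetical placeholders.

```python
# Minimal sketch: flag missing title tags or titles that all share the same
# hard-coded prefix. The URLs below are hypothetical placeholders.
import re
import urllib.request

PAGES = [
    "https://www.example.com/",
    "https://www.example.com/news/item-1.html",
    "https://www.example.com/products/widget.html",
]

def get_title(url: str) -> str:
    """Fetch a page and return the text inside its <title> tag, or ''."""
    with urllib.request.urlopen(url) as response:
        html = response.read().decode("utf-8", errors="ignore")
    match = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    return match.group(1).strip() if match else ""

titles = {url: get_title(url) for url in PAGES}
for url, title in titles.items():
    print(f"{url}\n  title: {title!r}")

# If every title starts with the same long site-name prefix, the unique
# keywords for each page are buried (or missing entirely).
prefixes = {t[:40] for t in titles.values() if t}
if len(titles) > 1 and len(prefixes) == 1:
    print("Warning: all sampled titles share the same 40-character prefix.")
if any(not t for t in titles.values()):
    print("Warning: at least one sampled page has no <title> tag.")
```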

The challenge is to find out what the issue is, or more likely, what the numerous “things” are that collectively contribute to poor performance in search engines, and then recommend a course of action to fix the problems and maintain the results.

All that said, if I HAD to pick one thing, it would be to make sure a site is crawlable by search engine spiders. If a search engine can’t find the content, it doesn’t matter whether you use keywords in the right places and frequency, execute internal links perfectly or apply any other tactic. Content that can’t be crawled simply won’t be included in search results, at least not in a meaningful or useful way.

Solving such a problem, if it exists, will not only help PR and news related content, but overall company web site visibility.
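As a concrete, hedged illustration of what “crawlable” means, the sketch below checks two common blockers that keep search engine spiders away from content: a robots.txt rule that disallows the URL, and a “noindex” robots meta tag on the page itself. This is not TopRank’s tooling, and the site and page URLs are hypothetical placeholders; navigation-level problems like the one described above still require a hands-on review.

```python
# Minimal sketch: check whether key URLs are blocked by robots.txt or carry a
# "noindex" robots meta tag. Site and URLs are hypothetical placeholders.
import re
import urllib.request
import urllib.robotparser

SITE = "https://www.example.com"
URLS_TO_CHECK = [
    f"{SITE}/",
    f"{SITE}/news/press-release.html",
]

# 1. Is the URL disallowed for Googlebot in robots.txt?
robots = urllib.robotparser.RobotFileParser(f"{SITE}/robots.txt")
robots.read()

for url in URLS_TO_CHECK:
    allowed = robots.can_fetch("Googlebot", url)
    print(f"{url}: robots.txt allows Googlebot -> {allowed}")

    # 2. Does the page itself opt out of indexing via a robots meta tag?
    with urllib.request.urlopen(url) as response:
        html = response.read().decode("utf-8", errors="ignore")
    noindex = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*noindex', html, re.I)
    print(f"{url}: noindex meta tag present -> {bool(noindex)}")
```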

To see if a site is properly indexed by search engines, the site webmaster can compare the known number of pages being published to the web with the actual number of pages found in a search engine such as Google. In addition, the site can be verified with Google’s Webmaster Tools, which will then provide insight into any crawling errors encountered by Googlebot, Google’s software that visits web sites to supply content for Google’s search results.
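The “known number of pages published” side of that comparison can be approximated by counting the URLs listed in the site’s XML sitemap, then setting that figure against what a site:example.com query or Webmaster Tools reports. A minimal sketch, assuming the site exposes a standard sitemap at a hypothetical URL and that the expected page count is known:

```python
# Minimal sketch: count the URLs in an XML sitemap and compare against the
# known number of published pages. URL and count are hypothetical placeholders.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"
EXPECTED_PAGE_COUNT = 10000  # pages the CMS says it has published

def count_sitemap_urls(sitemap_url: str) -> int:
    """Return the number of <loc> entries in a standard XML sitemap."""
    with urllib.request.urlopen(sitemap_url) as response:
        tree = ET.parse(response)
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    return len(tree.getroot().findall("sm:url/sm:loc", ns))

if __name__ == "__main__":
    listed = count_sitemap_urls(SITEMAP_URL)
    print(f"URLs in sitemap: {listed} (expected roughly {EXPECTED_PAGE_COUNT})")
    if listed < EXPECTED_PAGE_COUNT * 0.9:
        print("Large gap -- compare against a site: query and check crawl errors.")
```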

Most PR professionals are not in a position to assess the health of the web hosting server, content management system, templates or database that make the corporate site work. Ensuring a web site is properly crawled and indexed should be handled by properly trained and experienced staff.

To help convince those staff to do this kind of work, check out the following articles on making your web site crawler friendly: “Ensure Your Site Is Crawlable“, “SEO Tip: Let the Spider Crawl” and “6 Tips for Google Webmaster Tools“.

This post is the first Basic SEO Tip for PR practitioners in a series of 10, “Top Ten SEO Tactics for PR Professionals”, to be published over the next 2 weeks. Read on for the next post, “PR tactics that affect SEO“.

About Lee Odden

@LeeOdden is the CEO of TopRank Marketing and editor of TopRank's B2B Marketing Blog. Cited for his expertise by The Economist, Forbes and the Wall Street Journal, he's the author of the book Optimize and presents internationally on B2B marketing topics including content, search, social media and influencer marketing. When not at conferences, consulting, or working with his talented team, he's likely running, traveling or cooking up something new.

Comments

  1. Barbara Ling, Virtual Coach says

    April 30, 2009 at 6:46 am

    I just got that myself yesterday regarding my forums as well as my blogs – it’s a very important point indeed.

    Barbara

  2. Laura says

    April 30, 2009 at 7:20 am

Funny that I stumbled on this blog – I deal with PR for lovethoseshoes.com and thankfully, we’ve just begun the process of SEO on our site. For me it is honestly like opening up another world – so many questions I’ve had over the years are now starting to be answered! I think it is going to make a huge difference to our visitor numbers, the quality of those visitors and the sales conversions. Brill tip on this blog for anyone in PR who is wondering why traditional offline PR isn’t working as well as it should.

  3. Joanne Lincoln Maly says

    April 30, 2009 at 11:09 am

Thanks, Lee. As a seasoned marketing, PR and corporate communications professional who just launched my own firm and a new website, I find there is no end to the many things that “need” to be done “now”. Your blog article helps crystallize the list. I’ll watch your blog for continued tips.
    Joanne Maly at Lincoln Maly Marketing in Cincinnati, Ohio

    • Lee Odden says

      May 1, 2009 at 5:45 am

      Thank you Joanne. As awareness builds and an appreciation for what needs to be done increases, so does the need for prioritization. I hope the ensuing series of posts about SEO for PR helps with that. 🙂

  4. Shertmann Lopez says

    April 30, 2009 at 11:28 am

Another important factor to consider is that even if a website is 100% search engine friendly, has an intuitive, anchor-text-optimized sitemap and an XML sitemap, search engines might still not index all of the site’s pages. The main reason is that search engines crawl the web at an average rate per day across the trillions of indexed URLs in their databases, so crawling and indexing a site with millions of pages will take a lot more than 6 months to get every URL indexed, as search engines break the work down into phases. Sites this big also need more than one XML sitemap per subdirectory to ensure an efficient crawling and indexing rate from search engines. One thing is for sure: don’t expect search engines to crawl an entire site at once unless it is a small business or personal site. Furthermore, don’t misinterpret not seeing all of your URLs indexed in search engine databases – it may very well be that search engines are crawling your site in phases and indexing pages accordingly.

    Cheers,
Shertmann Lopez

    • Lee Odden says

      May 1, 2009 at 5:44 am

Thanks Shertmann. I wouldn’t underestimate Google’s ability to crawl a large site. Even so, there are few sites with so many pages that Google would need an expansive amount of time to crawl them. Crawl priority is set higher if there are a substantial number of deep inbound links to the site. Registering with Google Webmaster Tools will reveal what has been indexed and what has not, as accurately as any PR professional dealing with SEO would need to know.

  5. Michael Martinez says

    April 30, 2009 at 11:55 am

    A site that large and that old should not be getting most of its traffic from search. What I feel is missing from your example is a target level for search referrals. What’s the point where the search engine optimization is declared to be “successful”?

    If the site is relevant only to dead or dying query spaces, it cannot expect much traffic from search. Maybe 5% of site traffic is all it can expect. Some sites simply don’t evolve.

    I think the hypothetical scenario needs to be fleshed out.

    • Lee Odden says

      May 1, 2009 at 5:41 am

An experienced SEO would be tempted to drill down into the specifics of a hypothetical, but I think the point gets lost in that, Michael.

The advice in this post centers on one of many potential search visibility issues that public relations professionals should be aware of. SEO is still so new to the PR industry that we’re building awareness first. The majority of PR people I talked to yesterday at a PR conference didn’t know what crawling or indexing issues were.

      Now that they do, they can tackle the specifics like the keyword demand issue you’re bringing up.

  6. Jason Baer says

    April 30, 2009 at 3:01 pm

    Lee –

Fantastic post, and a great idea for a series. In so many ways, PR and search are merging, and most PR practitioners just have never been taught SEO basics.

I completely agree on site crawlability. Without it, no other efforts are going to pay off.

    Good stuff. I’m looking forward to reading the rest.

    Best regards,
    j

    • Lee Odden says

      May 1, 2009 at 5:36 am

Thanks Jason. In his keynote, Brian Solis said the same thing regarding the convergence of disciplines. My motivation is to educate PR practitioners on the basics of SEO so they’re better empowered to win budget for SEO efforts, possibly bringing in outside resources like TopRank to do one-time audits or ongoing work as needed. For many others, it will be the stimulus to invest in SEO resources in-house.

      Either way, the company wins with better search visibility.

  7. PalmPreReviewer says

    May 1, 2009 at 6:15 pm

    A crawlable site, so simple, yet so elusive. Thanks for the insight.

    Raza Imam
    “a die-hard Blackberry user with a soft spot for the Palm Pre”

  8. JD Johnson says

    May 18, 2009 at 2:24 pm

    Great Site! Thanks for the new views on the subject. It was extremely helpful.

    BusinessHQR.com

  9. kodinz says

    May 21, 2009 at 9:35 pm

    hi,

I just came across this blog today. Your post is very useful for me. I have to read more about how to increase my PR. Thanks for your post, it really helps.

  10. EDC says

    May 25, 2009 at 10:42 am

As an SEO consultant for my business at internettoolsu.com, I run into this problem all the time. Having many, many pages that are not crawlable seems to be a common problem for website owners. I see this happen in particular with sites that are put up fairly quickly; when the spiders come in to index the pages, they get lost in the navigation. I look at it a bit like people surfing and losing track of their original intent – spiders do get distracted by how navigation menus are set up. My recommendation for individuals who are getting started on a mega site is to make sure each page, as it is built, is pinged or somehow linked so that the search engines know how to find it and get it indexed quickly before any new pages are added. Google seems to like a growing site; I don’t see good results for website owners who notify Google only after a mega site is built. Slow and steady works best. As for already established websites, it becomes a lot of work getting those inner pages indexed.

  11. preor says

    June 20, 2009 at 4:42 am

I just came across this blog today. Your post is very useful for me.

