I had some distractions earlier in the day and did not get to blog any of the pre-lunch sessions. So here’s the first post-lunch session: Links Q/A with Search Engines, moderated by Detlev Johnson from PositionTech. This session is a combination of short presentations followed by questions and answers.
Speakers include: Kaushal Kurapati – Ask Jeeves, Charles Martin – Google, Tim Mayer – Yahoo, Ramez Naam – MSN.
Detlev announces a party this evening sponsored by WebmasterRadio, Bruce Clay, PositionTech, TrueLocal and the NY Times. This is kind of funny: Tim Mayer takes a photo from the podium of Barry Schwartz blogging from the front row.
First up was Kaushal Kurapati from Ask Jeeves to talk about how links work with Ask Jeeves.
Links from high quality pages are best.
How Ask works:
1. Search the index to collect and calculate global information
2. Break the index into communities
3. Collect and calculate local subject specific information.
4. Apply all pertinent global and local information
What this means is: get popular in your subject area, not globally.
Be cautious of buying links. Overnight surge in link popularity is a red flag. Do not engage in reciprocal links. Avoid link farms, artificial link popularity. It’s easy for search engines to lose trust in a site, but difficult to get it back.
What to do:
Link to authority. Become an authority on a subject.
Ensure the site architecture allows the site to be spidered and that the content is search engine friendly.
Next up is Charles Martin from Google, a software engineer who works in the search quality area. This is his first speaking gig at SES.
Links are a proxy for human judgement.
Relevance = PageRank + Hypertext Analysis
– PageRank algorithm rates reputable pages higher
Links are votes for your page.
When you put your links together, try to "channel the user", "be the user". Use anchor text that is meaningful to the destination web page.
People linking to you:
Encourage related sites to link to you
Use unique relevant content to attract links
Avoid reciprocal links
Linking to others: You can’t control who links to you, but you can control who YOU link to.
Link to sites that are useful to users. Link deep.
Where to put links:
- Where useful to users
- Cross browser
- Static text links best
- Dashes not underscores
- Not too many parameters
- Short and legible
Backlink Obsession: Every hour you spend obsessing over backlinks and not on good content is an hour wasted.
He shows Google Sitemaps, which has a tool that shows how many urls are crawled and any errors.
Next up is Tim Mayer from Yahoo.
Link building is overemphasized. Great content will get links organically. Make tools and useful resources and people will link to it.
Link popularity has been a strong ranking feature for a while, and Yahoo is looking at ways to get away from that. Yahoo has been looking at social search, communities, and tags as ways to rank pages.
- Make links related to content. Help users find related information.
- Add unique and useful content that invites others to link to your site
- Use appropriate and specific anchor text to describe linked to content. (NOT "click here")
- Don’t use link exchanges or buy links
Yahoo Site Explorer – You can see the pages in a site and the inbound links to it. Site Explorer was created for the SEO audience.
Yahoo has added an internal link filter to the LINK: command.
RSS and Atom feed submissions report – this augments multiple-url submission via a text file and single-url submit.
Tim shows the Search Information and Contacts page – where you can get a site reviewed for reinclusion. Don’t link to unrelated content.
Last up is Ramez Naam from MSN Search.
Why do search engines use links?
1. The link structure of the web helps search engines understand which pages matter
2. Links are good labels ("anchor text")
If you engage in tactics that do not reflect the user’s benefit you are at risk.
Q/A with Search Engines:
Question from Andrew Goodman: How would you distinguish intentional Google bombing?
Tim (Yahoo): If there are 1,000 links coming into a site and the anchor text is all the same, that’s not human effort. You should be asking for links and letting the linking site pick the text.
Kaushal (Ask): It’s fairly easy to detect most manipulations.
Question from Barry: Tim said there’s too much focus on linking and not on content. Why have the other search engines not built similar tools?
Tim (Yahoo): Site Explorer is for SEOs, to reduce the query load on their front end.
Question: The questioner has a content site with tens of thousands of pages and wants to do more deep linking. Will those deep links benefit pages up and across the site?
Rameez (MSN): If your deep pages link to each other there will be some benefit to them when linked from external pages.
Charles (Google): Getting deep links is not a problem. What could be a problem is if all the pages of your site link to each other – that makes the site look like a blur.
Question: For the most part, the site has no canonical problem; most links go to the main url. However, they do use newsletters, RSS feeds, etc. with variables in the urls, and those urls get picked up by other sites. Will a link to those dirty urls pass value?
Tim (Yahoo): Keep it simple. Try to let the SE know that it’s the same page.
Charles (Google): Sounds like your content page would have two urls – one plain, one with variables. You can redirect the variable url to the static url. He says the site will get credit for the vote either way.
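To make the idea concrete, here is a minimal sketch of the url cleanup being described – mapping a "dirty" url with tracking variables back to its plain form so a server could 301-redirect to it. The parameter names here are hypothetical; substitute whatever variables your newsletters and feeds actually append.

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical tracking parameters that don't change the content served.
TRACKING_PARAMS = {"source", "campaign", "ref"}

def canonical_url(url):
    """Strip tracking parameters so every variant maps to one plain url."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in TRACKING_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

# A server would issue a 301 redirect from the requested url to the
# canonical form whenever the two differ.
print(canonical_url("http://example.com/article?id=7&source=newsletter"))
# -> http://example.com/article?id=7
```

Either way – redirect or not – the panelists suggest the link still counts as a vote for the site; the redirect just consolidates those votes onto one url.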
Question: What’s the best way to let the search engines know about new content?
Ramez (MSN): If you have good content, people will link to it organically.
Tim (Yahoo): Get a site that is already indexed to link to you.
Kaushal (Ask): Get into DMOZ and be crawler friendly.
Charles (Google): Best way to get links is organically – through blogs, etc.
Detlev (Moderator): A site can be indexed pretty quickly. I had a new site (SearchReturn.com) indexed quickly by getting linked by a blogger in this room. Detlev is looking at me when he says this and we’ve traded emails about it, but Barry might have also linked to him.
Question: It’s been mentioned to keep query string values small, but how small?
Tim (Yahoo): When a search engine looks at your site, url length is an issue when it has a lot of variables, because it’s likely to be a duplicate. Making urls look static will make the search engine more confident it’s a good url. Also limit the number of directory levels to 3 or 4.
Question: With the Google sitemap, is there a limit?
Charles (Google): Yes, there are limits: 50,000 urls, 50 sitemaps. However, the current beta may not support that many. Look to the online documentation for exact numbers.
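For reference, a sitemap file is just XML listing urls. A minimal sketch of generating one with Python’s standard library, assuming the 50,000-url-per-file cap Charles quotes above (the example urls are made up):

```python
import xml.etree.ElementTree as ET

MAX_URLS_PER_SITEMAP = 50_000  # per-file cap quoted in the session

def build_sitemap(urls):
    """Return a sitemap XML string for up to 50,000 urls."""
    if len(urls) > MAX_URLS_PER_SITEMAP:
        raise ValueError("split into multiple sitemaps and use an index file")
    root = ET.Element("urlset",
                      xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        entry = ET.SubElement(root, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(root, encoding="unicode")

print(build_sitemap(["http://example.com/", "http://example.com/about"]))
```

Sites with more urls than the cap would split them across multiple sitemap files, which is presumably why the limit on the number of sitemaps also exists.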
Question: For a niche category, once you’ve exhausted relevant links, where else can you get links?
Tim (Yahoo): Geographic, but be careful of creating a link network.
Question: Do you penalize when another (bad/spam) site points to me?
Tim (Yahoo): You can’t control who points to you. But, bad sites often link to bad sites. We’re not going to ban you, but make sure your site is clean. You shouldn’t get banned, but if you do – you can get your site reviewed.
Charles (Google): If bad sites link to you, hold fast.
Question: The questioner is from an insurance company whose agents all have web sites.
Ramez (MSN): They do look at how many sites per domain, per ip.
Question: Is there a penalty for using the add url form?
Ramez (MSN): No
Tim (Yahoo): No
Charles (Google): I would be stunned to hear using the submit url form would be poorly received.
Question: You say too many links too fast is a red flag. How many is too many and how fast is too fast?
Charles (Google): If you get a lot of links all with the same anchor text, that’s the issue.
Tim (Yahoo): It depends on the industry. Look at other sites in the same industry and how many links they have.
Question: Are the other search engines looking at adding a feature like Google Sitemaps?
Tim (Yahoo): Yahoo has a multiple url submit option.
My battery died again. This won’t happen again! Only 3 more questions or so. Barry also covered this session.