I was very torn between attending this session and Big SEO, but I got a bit of both. Here’s the official session description:
“Today’s search landscape is more diverse than ever. There are hundreds of new social bookmarking, community tagging, feed search and news search sites and systems emerging. This panel has representatives from some of the top social and feed search engines on the web today. These expert panelists will look at the top issues they are currently facing, as well as new options they can offer webmasters who are looking to get their sites listed.”
This topic could easily have been broken down into at least 2 different sessions as these are somewhat disparate topics and no one really talked about blogs per se. Speakers included: Owen Byrne from digg, Rick Klau from FeedBurner, Chris Tolles from Topix.net and Niall Kennedy.
First up was Niall Kennedy, who gave a fairly technical look at RSS feeds and their capabilities.
Feeds offer highly structured data and are discoverable by search engines. New browsers are giving RSS even more exposure to mainstream internet users, with IE7, Firefox 2.0 and Opera 9 all capable of detecting RSS feeds referenced in a web page.
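That browser autodiscovery works through `link rel="alternate"` tags in a page's head. As a rough illustration (my own sketch, not from the session), here's how a crawler or browser might detect those tags using only Python's standard library; the sample HTML and class name are made up:

```python
from html.parser import HTMLParser

class FeedLinkFinder(HTMLParser):
    """Collects RSS/Atom autodiscovery links from an HTML page."""
    FEED_TYPES = {"application/rss+xml", "application/atom+xml"}

    def __init__(self):
        super().__init__()
        self.feeds = []

    def handle_starttag(self, tag, attrs):
        if tag != "link":
            return
        a = dict(attrs)
        # Autodiscovery convention: rel="alternate" plus a feed MIME type
        if a.get("rel") == "alternate" and a.get("type") in self.FEED_TYPES:
            self.feeds.append(a.get("href"))

html = """<html><head>
<link rel="alternate" type="application/rss+xml" title="My Feed" href="/feed.xml">
</head><body></body></html>"""

finder = FeedLinkFinder()
finder.feed(html)
print(finder.feeds)  # ['/feed.xml']
```

If a page omits that tag, browsers have nothing to detect, which is why it's the first thing to check.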
Elements in creating an RSS feed:
– Link alternate
– Entry title
– Entry summary
– Entry author
Base Feed Vocabularies:
Extended vocabularies include calendar items, ecommerce and an array of other “packaging” of content.
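To make the base elements Niall lists concrete (entry title, link alternate, summary, author), here's a minimal RSS 2.0 feed assembled with Python's standard library; the element names follow the RSS 2.0 spec, while the site and entry values are invented for illustration:

```python
import xml.etree.ElementTree as ET

def build_feed(site_title, site_url, entries):
    """Build a minimal RSS 2.0 feed string from a list of entry dicts."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = site_title
    ET.SubElement(channel, "link").text = site_url
    ET.SubElement(channel, "description").text = site_title
    for e in entries:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = e["title"]           # entry title
        ET.SubElement(item, "link").text = e["link"]             # link alternate
        ET.SubElement(item, "description").text = e["summary"]   # entry summary
        ET.SubElement(item, "author").text = e["author"]         # entry author
    return ET.tostring(rss, encoding="unicode")

feed = build_feed("Example Blog", "http://example.com/",
                  [{"title": "Hello", "link": "http://example.com/hello",
                    "summary": "First post", "author": "editor@example.com (Ed)"}])
print(feed)
```

In practice a blog platform generates this for you, but the structure is why feed data is so easy for search engines to index.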
After you publish, you want to make sure you ping – sending a signal to RSS feed search services that an update has been made.
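The standard ping is a small XML-RPC call to a `weblogUpdates.ping` method. This sketch uses Python's `xmlrpc.client` just to marshal the request so you can see what goes over the wire (the blog name and URL are placeholders); actually sending it means pointing a `ServerProxy` at a ping service's XML-RPC endpoint:

```python
import xmlrpc.client

# weblogUpdates.ping takes the site name and URL; marshal the call
# without sending it so the request body is visible.
request = xmlrpc.client.dumps(
    ("Example Blog", "http://example.com/"),
    methodname="weblogUpdates.ping")
print(request)

# To actually ping (endpoint shown is an assumption, check the service's docs):
# xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/").weblogUpdates.ping(
#     "Example Blog", "http://example.com/")
```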
Check for errors in your feed:
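Beyond running your feed through a validator, a basic sanity check is easy to script. This hypothetical helper (my own sketch, not a tool mentioned in the session) flags a feed that isn't well-formed XML or is missing required RSS 2.0 channel elements:

```python
import xml.etree.ElementTree as ET

def check_feed(feed_xml):
    """Return a list of problems found by a minimal RSS 2.0 sanity check."""
    problems = []
    try:
        root = ET.fromstring(feed_xml)
    except ET.ParseError as err:
        return [f"not well-formed XML: {err}"]
    channel = root.find("channel")
    if channel is None:
        return ["missing <channel> element"]
    # RSS 2.0 requires title, link and description on the channel
    for required in ("title", "link", "description"):
        if channel.find(required) is None:
            problems.append(f"channel missing <{required}>")
    return problems

print(check_feed("<rss><channel><title>T</title></channel></rss>"))
# ['channel missing <link>', 'channel missing <description>']
```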
Claim your feed:
Yahoo Site Explorer
Niall says the benefit of “claiming” your feed is that it raises the bar: of all the feeds out there, only a small number are claimed or validated, and doing so may make your site/feed more credible.
Subscribe to your feed:
– Ask Blogs and Feed Search and MyYahoo will not include your feed unless people subscribe to it
Watch for masked links:
Cross-domain 302 – when you use FeedBurner as your feed provider, links within the feed actually redirect via 302 from a feedburner.com URL to the destination. This is so FeedBurner can track item views. An alternative would be a CNAME DNS entry pointing to a subdomain. (Note: FeedBurner already offers this functionality.)
Add author data:
name, email, URI – enables author level search and rank
Next up was Rick Klau from Feedburner. I had a chance to talk to Rick after the session and Feedburner has some interesting enhancements in store regarding the use of the Blog Beat analytics service they purchased earlier this year and the addition of more data in the RSS reporting.
FeedBurner-hosted feeds reach 25 million people a day, and FeedBurner is the largest provider of RSS hosting services. Rick thinks IE7 is the beginning of mass RSS adoption.
Evolution of feeds:
2003: It was pretty much Blogs and RSS
2006: The use of feeds has extended from blogs and RSS to podcasts, video, ecommerce, web services, (online print and broadcast media)
One example of how feeds are being used is a deals retailer offering an RSS feed of closeout items. The reasons for creating and consuming feeds are far more varied than they once were.
- IE7, Firefox 2.0
- Social services: digg, Google reader shared feeds
- Meme tracking is another discovery mechanism
- Style sheets and browsers are helping feed usability – Raw RSS code is going away
- Edgeio – decentralized classified service – Will index your content, make it searchable and drive traffic back to your site
- Sphere – blog search engine
It’s very important to ping – he shows the FeedBurner FeedShot service; you can also use pingomatic.com.
Add functionality to your feeds:
Other features can engage users to do things with your content. He shows FeedBurner functionality that lets a feed display an invitation to digg a feed entry, along with how many diggs/comments have already been made.
Clickthrough tracking is optional with Feedburner and you can use a 301 or 302.
Note from Lee: I’ve noticed from speakers at other conferences who talk about RSS feeds that there is a misconception about being able to use a 301 redirect to a subdomain in the FeedBurner toolset that offers SEO benefit. FeedBurner has been good about responding to these claims by enabling the 301 redirect as well as the use of your own domain name as the feed URL. However, a subdomain is seen by search engines as a different site from your main domain name. ie: feed.domainname.com is NOT the same thing as www.domainname.com/feed. FeedBurner allows you to redirect to feed.domainname.com and there is some branding value there, but little if any SEO benefit. For SEO benefit, you would need to be able to redirect to www.domainname.com/feed, and that is not possible with the current FeedBurner system or any 3rd party hosted RSS service.
Now back to the presentation:
Know which services know you – bots. Feedburner knows 3000 feed aggregators and several hundred bots. It can be useful to know which bots are crawling your feed.
Next up was Owen Byrne, co-founder of digg. Owen was a freelance web developer hired by Kevin Rose, who pitched the idea of a news site where the users were the editors – the “wisdom of crowds.”
Live in December 2004
Paris Hilton phone hack, Feb 2005 – doubled traffic to digg, which doubled again the next week
Version 2 July 2005
Version 3 June 2006
digg is a democratic process, no editors. Members are extremely vocal and motivated: vote down spam, control home page, top stories are what’s important to users.
Other digg factoids:
500,000 registered members
4000 stories submitted daily
Shows diagram: “digg” effect much larger than “slashdot effect”
How Digg has scaled:
- Process for multiple developers
- Technical and mgt issues
- Avoid premature optimization
- Cache, cache and more cache
- Hardware is cheap; downtime is not
- Lots of servers: spares, monitoring, testing, development
digg has 90+ servers for both a production and development environment
Allows 3rd party sites to enable “digg this” story submission buttons
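Those buttons typically just link to digg’s submission page with the story URL and title prefilled. A sketch of building such a link – note the exact submit URL and its parameters are an assumption based on digg’s submit form of the era, not a documented API:

```python
from urllib.parse import urlencode

def digg_submit_link(url, title):
    """Build a 'digg this' submission link.

    The http://digg.com/submit endpoint and the url/title parameter
    names are assumptions, not official documentation.
    """
    return "http://digg.com/submit?" + urlencode({"url": url, "title": title})

link = digg_submit_link("http://example.com/post", "My Post")
print(link)
```

A site would render this link next to each story so readers can submit it in one click.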
Wow, so the tipping point for digg was Paris Hilton’s cell phone getting hacked! Hmmm this makes for an interesting marketing idea.
Last up is ex DMOZer Chris Tolles, co-founder of Topix.net
Topix is the only service that offers news by zip code. Local news was generating 60% of traffic with only 10% of content.
Topix ranks in the search engines on many permutations of geographic location + “news” – fresh content ranks well!
Topix offers 360,000 feeds.
Topix traffic was leveling so they added a forum. They found people wanted to react and respond to news. This boosted traffic significantly. The magic in increasing visitors is providing interactivity. The Topix forum went from 0 – 16,000 comments/day in 4 months.
- Fresh content for your site
- Pull instead of push
- RSS feeds to increase distribution of content (1/3 of story clicks for Topix are via RSS)
Topix is #25 on the comScore list of top news and information sites, so improving interaction and distribution can have very positive effects.
Topix started as an information publisher site, then added interactivity and RSS, and traffic has boomed.
At this point I went over to the BIG SEO session to catch the Q/A.