Wired ran an interesting article recently about how the U.S. Government is using topic maps to provide context for words that have multiple meanings. That kind of context helps infer the intent behind ambiguous queries: is "Mustang" the car or the horse?
This is the challenge for search engines as well. Clustering tools like those offered by Vivisimo's Clusty search engine appear to be quite effective, but are not widely adopted. A search on "Mustang" clusters groups of search results according to topics like "Parts", "Horse" and a town in Oklahoma. That's a lot more helpful than a similar search on Google, which displays mostly car web sites on the first page of results.
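To make the clustering idea concrete, here's a minimal sketch of how results for an ambiguous query like "Mustang" could be grouped by topic. This is a toy illustration, not how Clusty actually works: the snippets, the similarity threshold, and the greedy one-pass strategy are all assumptions chosen to keep the example short.

```python
from collections import Counter
import math

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def cluster(snippets, threshold=0.3):
    """Greedy single-pass clustering: each snippet joins the most
    similar existing cluster, or starts a new one if nothing is
    similar enough. Real engines use far more robust methods."""
    clusters = []  # list of (centroid word counts, member snippets)
    for text in snippets:
        vec = Counter(text.lower().split())
        best, best_sim = None, threshold
        for c in clusters:
            sim = cosine(vec, c[0])
            if sim > best_sim:
                best, best_sim = c, sim
        if best is None:
            clusters.append((vec, [text]))
        else:
            best[0].update(vec)   # fold the snippet into the centroid
            best[1].append(text)
    return [members for _, members in clusters]

# Hypothetical search snippets for the query "mustang"
results = [
    "mustang gt parts exhaust and wheels for your ford mustang",
    "ford mustang gt parts catalog wheels exhaust",
    "wild mustang horse herds roam the open range",
    "adopt a wild mustang horse from the range program",
]
for group in cluster(results):
    print(group)
```

Even this crude word-overlap approach separates the car snippets from the horse snippets, which is the whole point: the surrounding words, not the ambiguous keyword itself, carry the topic.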
I do believe search engines are moving away from ranking pages by link citation and on-page keyword representation, and toward a better understanding of the context in which information and documents are used. Whether that's achieved through pre-made indices like topic maps or on-the-fly categorization and clustering, search engine users will respond to the best search experience.
Darin Babin gave a good example of contextual relevance for on-page optimization at the WW Search Conference, one that makes sense if search engines are indeed moving away from current ranking methods. He described the exercise of optimizing a page for a particular phrase, then removing that phrase from the document text. A person should be able to guess the missing phrase based on the context in which it's used. He speculated that search engines will algorithmically rank pages on contextual relevance and not so much on linking and keyword usage.
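That exercise can be sketched in a few lines. This is not Babin's method or any engine's algorithm, just a hypothetical illustration: the topic vocabularies below are made-up context words, and the scoring is plain set overlap.

```python
# Hypothetical phrase-to-context vocabularies. A real system would
# learn these associations from a large corpus, not hand-code them.
TOPIC_CONTEXT = {
    "mustang car": {"ford", "engine", "horsepower", "v8", "coupe"},
    "mustang horse": {"wild", "herd", "graze", "saddle", "range"},
}

def guess_missing_phrase(page_text):
    """Score each candidate phrase by how many of its associated
    context words appear on the page; the highest overlap wins."""
    words = set(page_text.lower().split())
    return max(TOPIC_CONTEXT, key=lambda p: len(TOPIC_CONTEXT[p] & words))

# The target phrase has been blanked out, as in Babin's exercise.
page = "the wild ____ herd will graze on the open range all summer"
print(guess_missing_phrase(page))
```

If the surrounding copy is well optimized, the blanked phrase is recoverable from context alone ("mustang horse" here), which is exactly the signal a context-aware ranker would reward.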
Personally, I don't think link citation will ever go away completely, and I hope the trend towards multiple primary search engines continues. Yahoo and MSN are definitely making huge improvements, and in the end the consumer will win with multiple, relevant, high quality search options.